The End of OS X (stratechery.com)
375 points by Amorymeltzer 10 days ago | 573 comments





I've looked at the trajectory of Apple these past few years with mounting bittersweetness. At its best, MacOS really felt like the best of Unix combined with the consumer focus alluded to in the article. I still adore the craftsmanship of the latest MacBooks, but my 2015 MacBook felt like the apex of Mac's design. It was the kind of device that was such a pleasure to develop on. You could leverage the power of Unix in a UI designed with thoughtfulness and care. It reminded you that power and accessibility to end users can be balanced. And now that balance has tipped away from power users and developers toward the end user, and as it has, I've come to question my commitment to this extraordinary ecosystem. Why develop on a platform that contends with me every step of the way? Because the users and the money are there? That was never what brought me to the MacBook.

EDIT: I suppose most users in the Apple ecosystem aren't on Macbooks, but iPhones.


I keep reading these types of comments on HN but I just can't relate. I consider myself a power user, and it's precisely because MacOS feels like the best of Unix combined with customer focus that I'm wholly invested in the Apple ecosystem. I simply can't relate to the statement that the Mac is a "platform that contends with me every step of the way", and I'm curious to know more specifically why you feel this way?

I just "use linux" and I get all the benefits with much better hardware (more memory, disk storage, speed for compilation), better drivers (important especially for GPU support for either graphics/rendering work or machine learning) and my downsides are... what exactly? Itunes, Mail, etc, none of the software shipping with MacOS is actually any good. I don't use Safari. From your perspective, what aspect of the "customer focus" materially improves anything? ICloud? The thing that lost a ton of my data over the years? Safari? The thing that doesn't even support WebGL 2 or other standards that are a decade old? (seriously, safari is the new internet explorer).

> and my downsides are... what exactly?

I've spent zero days, zero hours, zero minutes, and zero seconds fiddling with setting anything up in macOS over the last 15 years.

I manage a bunch of Linux systems, from laptops to desktops to servers, and over the last 15 years I've probably spent months of my life setting up stuff that should have just worked.

I use a Mac at home and as my dev machine, because I charge quite a bit of money for doing that kind of work on Linux, and I don't want to throw my own money away like that during my free time.

> iTunes, Mail, etc., none of the software shipping with MacOS is actually any good.

I use many Microsoft Office apps and Adobe suite apps (mainly Photoshop, Lightroom, and Illustrator), and all of them just work.

VNC clients, VPN clients, Samba, they all just work. External devices, no issues. Do I want to stream my display over WiFi to my television, or my audio to the stereo? Just one click away. Do I want to restore the backup of one directory from 2 years ago? It's three clicks away.

On Linux, half the steps to each of these things are filled with hours and hours of research in the Arch wiki. Do you want to stream your monitor to your TV? Start by setting up a media server, and then 200 steps. Do you want to restore a partial backup of some folder on a btrfs drive? A dozen steps that I cannot remember. Like, no way. I don't want to remember how to do these things, and I don't want to have to look it up. If nobody is paying me to do these things, it should be completely obvious how to do them and it should just work.

But hey, do convince your employers to use Linux everywhere, it's great for the business.


I've been a *nix guy since 1983. Worked for a major *nix player, wrote end-user application software on the platform, built chip design software on the platform, wrote device drivers for all kinds of hardware, system admin for years and years, etc. etc. I'm not some newb who just fell off the truck.

In many ways *nix is still stuck in 1983 because it's a craftsman's platform, which is why there are 10,000 distros, oops 10,001, no 10,002, oh 10,003, ugh! Linux is great on servers, because all servers require craftsmen for their care and feeding, and Linux is so much better than Windows in that regard.

I gave up my Linux machines because I'm no longer interested in being my own sys admin. Frankly I was done being my own sys admin back in the 90's but I felt the need to keep my propeller spinning, and stay true to my geek roots.

In 2008 I bought an 8-core Mac Pro, and it's still kicking: my son uses it for machine learning as part of his graduate program, and it's just as fast now as when I first bought it. It was left behind with the last OS upgrade, but it will work just fine for another 4 or 5 years on the old OS.

In 2009 I still had a Linux laptop, but it was painful back then. With Snow Leopard it became easier to create and maintain a Hackintosh than it was to manage a Linux laptop, and I was off Linux for good.

For a hobbyist (craftsman), Linux is great; for a server that needs a craftsman, Linux is great; for a consumer, buy a Mac.


And most server admin is moving to an “immutable infra” model where the Linux box is a container worker node that is disposable. Life’s too short to be sysadmining.

I went to a mix of Windows/macOS instead, but can fully relate to your story, as I had a UNIX zealot phase during my university years, and have been the UNIX/Windows porting guy at several projects.

My reason for giving up was reaching the conclusion that Linux distributions would never replicate the expectations of an Amiga/BeOS like experience for those that care about multimedia.


This is anecdotal. My own anecdote is that we've had far more trouble with my wife's MacBook than my NixOS ThinkPad which just works, even after upgrades.

The Linux experience depends on the distribution, software and hardware you choose. You need to choose carefully.


> you need to choose carefully.

So here’s the interesting part; can you recommend a distro that is universally the right choice for dev work?

If yes, please tell - I’d love to know!

If the answer is “well it depends a lot on your needs” that’s kinda the point of MacOS right? The fact that making the right distro choice requires serious thought is already problematic (hypothetically)


You do have a point, with Mac the choices have been made for you. I guess the most mac-like distro is probably Ubuntu. If you pick the LTS, you might even have a smoother ride than the macOS yearly updates.

> I've spent zero days, zero hours, zero minutes, and zero seconds fiddling with setting anything up in macOS over the last 15 years.

I have to say that I 100% do not believe this statement.

You mean you've never used any wireless connections to your machine? You've never had to set up backups or change audio settings? You've never had to deal with setting up one thing in 15 years? Since when did OS X gain the ability to read minds?


>You mean you've never used any wireless connections to your machine? You've never had to set up backups or change audio settings? You've never had to deal with setting up one thing in 15 years? Since when did OS X gain the ability to read minds?

You seem to confuse using the machine with setting up.

Connecting a wireless connection (picking the SSID, entering the password, etc) is not "setting up". It's merely regularly using the computer.

The setting up we'd rather avoid, and that the parent talks about, would be tinkering to get your wifi chip working with your OS -- which you get the "pleasure" to do on Linux...


His claim is he did it 15 years ago, and never had to touch anything else since. Definitely a stretch, but quite believable. Just a few days ago I had to figure out how to reinstall Nvidia driver after Ubuntu 18.04 to 20.04 upgrade. And the damn thing still shows my primary screen panned whenever I log off/log back on.

I was glad to return to Windows when the Linux-specific issue was solved.


Isn't this YOUR problem for using infamously unsupported hardware with Linux? The problems would be the same if you used a Hackintosh on unsupported hardware too.

Somehow I need my GPGPU and there are currently no good alternatives.

I've got a stock Ubuntu 18.04 install at work. Sometimes, but not always, it hangs for three minutes or so trying to run fsck on my UEFI partition. Other times, I'm at the login screen within a minute of turning it on.

When it hangs, it finally exits with an error message saying it couldn't mount /boot/efi. Once I log in I can see that /boot/efi is mounted fine.

A bunch of my applications hang all the time for reasons I couldn't figure out. Turns out they were trying to communicate with my Gnome keychain over D-Bus, but they were in a separate D-Bus session so they just sent their D-Bus message and then waited for however long until they gave up. No idea at what point this started happening or why, but I never did figure out how to fix it (other than disabling those apps from being able to store passwords securely).
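
(For the record, the mismatch is visible from a terminal. A rough check, and treat the exact bus/service names as my best recollection rather than gospel:)

    # Which session bus will apps launched from this shell use?
    echo "$DBUS_SESSION_BUS_ADDRESS"

    # Is the keyring's Secret Service actually reachable on that bus?
    dbus-send --session --print-reply --dest=org.freedesktop.DBus \
        /org/freedesktop/DBus org.freedesktop.DBus.ListNames | grep -i secret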

If I used the Intel drivers for my Intel GPU I got extreme stuttering and tearing, which only went away when I uninstalled the Intel drivers entirely so that it would use the "correct" drivers.

Then there's snaps. I install stuff and sometimes it's a package, and sometimes it's a snap. If it's a snap, everything turns into a giant mess. I had Firefox saving its downloads to Skype's workspace for some reason, multiple Firefox profiles on my system in various places, multiple versions of Firefox running when snapd downloaded a new version and symlinked it to be the default, two copies of Telegram for some reason and no obvious way to tell which was which, a snap and non-snap version of things because the Store installed the snap but another package depended on the apt package, and so on.

I installed my system on btrfs because "it's like a decade old now, it should be fine right?" and within a month my filesystem had been completely corrupted and I had to reinstall entirely (though I managed to salvage most of my home directory at least).

This is all stuff I'm willing to tolerate to some extent, but I can't think of a single thing that Linux offers me that makes it worth all this hassle. At my last job, I happily used a MacBook for nine straight years, and I never lost an entire day of productivity just because my system suddenly wouldn't boot, or dealt with graphical glitches in every app because the driver for my video card isn't the right driver for my video card, or lost data because one of the default options for the installation filesystem is unstable and unreliable.

I've never had to do much of anything to "tweak" MacOS unless I wanted to; on Ubuntu, I have to do it almost every day to get something working well (or at all).

So I'm gonna go ahead and say that yeah, MacOS isn't even remotely dead, and it's certainly not going to lose to Linux any time soon.


Thank you for providing all these perfect examples:

> You mean you've never used any wireless connections to your machine?

Yes: I clicked on the network, typed the password, and it worked, always. If the machine has already seen the network, I don't even have to click, it just connects. If one of my machines (e.g. my iPhone) has seen the network, the network and its credentials are stored in iCloud, so my laptop and my tablet can just connect to it. I never had to google how this works, never had to install anything to make this work, never had to change any configuration file to make this work, etc. Same for VPNs, VNCs, corporate networks, network drives, etc.

> You've never had to set up backups or change audio settings?

Yes, I clicked on enable backups, clicked on the box that enables daily backups, clicked on the drive where I wanted to store them, and it worked, always.

I can restore, browse, and partially restore backups with one click using Time Machine. I never had to google how this works. It always worked.

> change audio settings?

Yes, I clicked on the audio button, and volume worked, muting worked, all 4 sound outputs worked, always worked, etc.

---

On Linux, over the last 15 years, I have googled for days at least to do _each_ of these things for customers:

- can't connect to a particular WLAN, corporate WLAN, etc.: needed new drivers, a kernel upgrade, a change to some /etc config to enable something, etc. Same for network drives, cloud drives, SMB, ...

- sound, damn sound: sound cards not being detected (hell, on my Lenovo Yoga an update to Ubuntu 19.04 pushed a kernel patch that broke my sound card's driver: no sound card detected, still not fixed on Ubuntu 20.04 5 months later - no sound in the times of corona and web meetings - great!), alsamixer, /etc files, PulseAudio, .... I have had to google a lot about each of these things, and fiddle a lot over the last 15 years, on dozens of machines, for dozens of users, for hours.

- backups, on Linux, have fun. I've set up backups for ext4, ZFS, and btrfs. ALWAYS had to google for hours. Every time I had to restore a backup, I had to google for hours. Sometimes restoring did not work. Sometimes backups did not work. Encrypted backups do not work out of the box (Apple's full-disk encryption is enabled by default - but enabling it and having it work with everything is just 1 click - on Linux full-disk encryption is a mess).

---

So no, I never had to set up anything on MacOS X in the last 15 years. The things were in the obvious place, and I just used them, and they just worked. No need to even google how to do the thing, or what to install to get it done, or configure the install, or read bug reports to work around issues, nothing. They just damn worked, as they should.

It's 2020, and setting things up for companies using Linux is my day job. These people would all save a ton of money by just using Macs or Windows machines. The amount of time their employees waste on setting up and maintaining their machines instead of doing their actual work is astonishing.


What if you want to type without accidentally hitting your cartoonishly-large trackpad? Or if you want to type without a key accidentally registering twice because someone decided that thin keys were better?

Applications and hardware drivers on Mac just work. Even Homebrew works pretty nicely. The visuals of Mac are vastly superior to any flavor of Linux I've tried: smooth graphics, font rendering, HiDPI support, etc. Some aspects of Mac hardware (screen, touchpad) are very hard to come by with other manufacturers.

At this point I'll probably switch to Linux as soon as my 2015 Macbook stops working, but I would've really liked to stay with Apple.


> Even Homebrew works pretty nicely

It always surprises me when people cite Homebrew as a plus rather than a minus in these discussions. Compared to the experience I have with Linux package managers, I've found Homebrew to be extraordinarily slow even at the best of times. A simple "brew update" will generally take at least 30 seconds; in that time on something like pacman, I'll usually not only have already finished fetching the list of updates but also actually performed the upgrades, and that's updating my entire system, not just a few userspace packages that I have installed. I can understand that some people prefer MacOS and I don't think they're wrong for doing so, but it's hard for me to believe that someone has given Linux a fair shot when they cite package management as something that macOS does better.
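
(For anyone unfamiliar with either side, these are roughly the two workflows I'm comparing, assuming Homebrew on macOS and pacman on Arch; timings obviously vary by machine and mirror:)

    # Homebrew on macOS: refresh the formula index, then upgrade installed formulae
    brew update
    brew upgrade

    # pacman on Arch: sync the package databases and upgrade the whole system in one go
    sudo pacman -Syu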


In my experience this is partly because Homebrew has steadily moved to a "knows better than you" stance and every time you ask it to do one thing, it goes off and does a bunch of other things first.

MacPorts is well worth a spin if you're on a Mac and have trouble with Homebrew. I switched sometime last year and am very happy. I think I've actually been using it more because I know it's not going to be a pain every time.


I've been a MacPorts user since the days of Fink vs. MacPorts and tried and failed to use brew many times.

The biggest argument at the beginning for Homebrew over MacPorts was binary distribution and now MacPorts has that and can still compile custom versions if necessary.

MacPorts is always the first thing I install on a new Mac and has been for over 15 years. The several times I've seriously tried Homebrew, I get frustrated (terrible jargon, missing packages, spewing files all over my system, lax security) and go back to MP.


I've been using Homebrew for years and, although I've not had any major problems with it, I do find the slowness irritating. And also the fact that it tends toward ridiculous bloat by keeping previous versions of everything installed, unless you remember to do some spring cleaning every now and then.

I'm up for giving MacPorts a go. But is there a pain-free way to transition between Homebrew and MacPorts, without having to reinstall everything over again? I've installed a load of stuff with Homebrew, over the years and don't fancy having to pick through it all again to try and remember what I installed and why.
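
(The closest I've gotten on my own, for what it's worth: it only covers formulae, not casks, and the names won't always map one-to-one to ports, so the last step is still manual. The package names below are just placeholders.)

    # List only the formulae I asked for explicitly, not their dependencies
    brew leaves

    # ...then map those names to ports by hand and install them, e.g.:
    sudo port install wget ripgrep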


While Homebrew is absolutely slow, the experience of package management on macOS (homebrew gets you bleeding-edge versions of the crap you install, whereas Apple manages the system and keeps it stable) is IMO better than Linux where the choice of stable vs. bleeding-edge is systemwide.

Obviously requires significant setup, but this[1] seems to solve the issue you are talking about on linux.

[1] https://bedrocklinux.org/


Yeah, sigh, Homebrew has definitely slowed down in recent years.

> in that time on something like pacman

If you're using Pacman (and presumably Arch or some derivative), that might explain your better experience on Linux. You're on a rolling distro so you get up to date packages (and I believe AUR gets things even quicker).

Homebrew is much closer to that rolling always-up-to-date (within a day or two usually) experience than Apt or Yum on Debian/Ubuntu/Fedora (most people's linux experience), where you have a choice between packages that are 3-6 months out of date or hunting around on the web for a 3rd-party repository to add which contains more up to date packages.

Homebrew can be slow. But the fact that I can `brew install X` or `brew upgrade X` and get on with something else while it's doing its thing means it doesn't generally take up much of my time.


> Homebrew is much closer to that rolling always-up-to-date (within a day or two usually) experience than Apt or Yum on Debian/Ubuntu/Fedora (most people's linux experience), where you have a choice between packages that are 3-6 months out of date or hunting around on the web for a 3rd-party repository to add which contains more up to date packages

It still has a centralized list of the versions of the packages it supports (on GitHub) though, right? I feel like it should be able to get this list with a single HTTP request to the GitHub API; I don't see why having a faster or slower schedule for updating the packages should affect how long it takes to actually download the list of updated packages with `brew update`.


> The visuals of Mac are vastly superior to any flavor of Linux I've tried.

This is subjective. I think the MacOS UI looks and acts like a toy. My Linux machine has a Solarized theme (I can toggle light/dark) with minimal window borders (i3, polybar). I think it's much more tasteful than MacOS. GNOME looks better too and it's themeable, unlike MacOS. But this is all of course just my opinion.


I agree. I dislike the Mac UI, particularly Finder. The one Mac tool I use and love is the screenshot shortcuts. Other than that, 99% of my time is spent in the terminal, Sublime Text, and Firefox.

At this point FreeType is generally better (faster, more featureful) than Apple's Core Graphics, although FT is very customizable and I prefer Apple-like defaults to the ones most Linux distros use. You can definitely get Apple-like rendering out of FreeType; disabling hinting and making sure stem darkening is on gets you most of the way there.
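
(Concretely, something like the following; file locations and property names are from memory, so treat it as a sketch rather than a recipe:)

    # Keep FreeType's stem darkening on ("no-stem-darkening=0" means: do darken);
    # export this from /etc/environment or your shell profile
    export FREETYPE_PROPERTIES="cff:no-stem-darkening=0 autofitter:no-stem-darkening=0 type1:no-stem-darkening=0"

    # Disable hinting via fontconfig, e.g. in ~/.config/fontconfig/fonts.conf:
    #   <match target="font">
    #     <edit name="hintstyle" mode="assign"><const>hintnone</const></edit>
    #   </match>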

> hardware drivers on Mac just work

Same for GNU/Linux if you choose the hardware for it (or buy preinstalled).


Try getting hardware accelerated video in Chrome, or FF. Not supported.

Interestingly enough this causes all kinds of weird distortion on my brand new 16" MBP. Not sure if it's a hardware issue, but it only happens in the browser with GPU acceleration turned on...

Hardware acceleration has been supported in FF since 75. However, your display scaling has to be 100% on Xorg, or you have to use it under Wayland. The Linux desktop would've been perfect if it weren't for the fragmentation. BTW, currently running Ubuntu 20.04.
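
(Roughly the recipe I'm referring to, from memory, so double-check the pref names against the current release notes:)

    # Run Firefox as a native Wayland client so VA-API decoding can kick in
    MOZ_ENABLE_WAYLAND=1 firefox &

    # ...and in about:config:
    #   gfx.webrender.all          = true
    #   media.ffmpeg.vaapi.enabled = true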

In my experience this argument is BS. Maybe the computer on day one works OK, but it's downhill from there as you maintain and update.

Huh, in twenty years of using Linux, hardware support has improved with each update, not gone downhill.

What I do agree with is that on day one support might not be as good for new types of hardware (especially for more niche hardware like a fingerprint reader for example). Even that has improved a lot over the years though.


It really does depend on the hardware.

Thinkpads usually work really well out of the box (even the nipple mouse works without a hitch).

I have plenty of horror stories about other computers though. Usually involving being deep in some page tree on a company's website that they don't seem to have actually expected anyone to go to, trying to figure out which specific vendor id maps to the actual hardware I have.


> ... but it's downhill from there as you maintain and update

I've been maintaining a Fedora desktop installed 8 years ago without a snag, updating every year or so as time permits. Not a single problem.


I did exactly that with an Asus netbook, then Canonical and AMD broke the whole experience by replacing working binary drivers with WIP FOSS alternatives, because Free is Good™.

Until you have to connect to a 12-year-old printer at work.

Mac and Linux actually use the same printer stack: https://en.wikipedia.org/wiki/CUPS

Most corporate printers are network-attached, and many also run CUPS internally.

> Applications and hardware drivers on Mac just work.

... until Apple decides to break an API and not provide equivalent functionality, and then you end up with a perfectly functional soft-bricked paperweight. I can't use my external USB monitor because of this.


I switched to Ubuntu for work about a year ago, I've done zero fiddling since the day I set it up. Linux has really come a long way even in the last 5 years, IMO. I still have a 2015 MBP that I use for my personal work and it's still solid.

Same, as of late last year I now run an Ubuntu desktop and the Macbook together. What I really like about this setup is the ability to sidestep some problems instead of having to solve everything, allowing me to focus on doing paid work instead of troubleshooting the platform.

Except when they broke most Homebrew installations a few years back (Mojave maybe? Been using Linux for a while now too, that was my tipping point)

Everything on Ubuntu just works, too.

Except for fractional scaling if you use the proprietary NVIDIA driver. This was an issue on the latest LTS release.

It became worse for me since I upgraded to 20.04. Now the central screen in a 3-screen setup gets panned, which can only be fixed from the Nvidia control panel plus a graphics manager restart. It breaks on screen lock. Don't upgrade to 20.04 yet, if that's what you have.

I've been running Linux on Macs since my 2005 12" PowerBook. There were a couple of years in the middle where I switched to OS X with Linux running in a VM due to hardware support issues (the VM was a pain in the ass to use; there was no end to the performance problems). I never liked OS X as an OS all that much. It was fine, but I found it got in my way more than it helped me.

In 2016, after getting tired of various hardware bits never working just right on Linux, I gave up and tried someone else's hardware. It was amazing that everything just worked with Linux. I didn't have to fight with bleeding-edge, just-reverse-engineered drivers. I didn't have to wonder if the laptop was going to wake up again after being put to sleep. But I chose somewhat poorly: the hardware I chose was decent, but rough. The battery expanded to the point where it warped the case, which caused the keyboard illumination to intermittently not work, and eventually made the touchpad impossible to use.

So I gave up in 2018 and got a MacBook Pro again, though it was a secondhand 2016 model. Once I installed Linux, I found that the keyboard and touchpad didn't work, and I had to build an out-of-tree driver for it, ditto for the WiFi. After that I found that audio and suspend-to-RAM didn't work. It turns out I was "lucky" that I had an older model; the keyboard and mouse setup in the 2017 and 2018 models hadn't been reverse engineered yet and didn't work at all.

In 2019 I got fed up with having to use the USB-C headphone adapter from my Pixel in order to hear sound, and to remember to shut my laptop down completely if I was going to be away from AC power for more than a few hours. I again left the Apple hardware world. It's not perfect, but again I marvel at how everything just works (with Linux) without tweaking. I hear so many colleagues complain about various inexplicable broken things on macOS all the time, and I think I've gotten the better deal.

Is Linux as polished as macOS? Of course not. I agree with you that the graphics aren't as smooth, font rendering isn't as good (though this is debatable; personally I find Linux rendering to be more crisp and readable than macOS), HiDPI support varies depending on what toolkit any given app was written with. But it works well enough. I think one of the biggest problems that has kept (and will continue to keep) Linux from going mainstream is that even though for the most part it works just fine, when it does break, it breaks spectacularly to the point that you need specialized knowledge to fix it. Fortunately I have that specialized knowledge, so I'm fine, but your average home user does not.

I don't have to deal with Apple's keyboard design issues, random WiFi failures (yes, it is incredibly funny to me to find that Linux WiFi can work better than macOS), Apple's overzealous "system integrity protection", or Apple in general deciding they know better than I do as to how I want to use my computer. It makes me happy, and I feel productive.


You are quite persistent about Mac hardware. Why?

It is a hostile platform, so of course it's rough. I use a 14" Dell Latitude - no problems at all, beautiful machine. I'd probably stay on it or try an Asus ZenBook.


Because the hardware is beautiful and the build quality is (was?) excellent. Recent fumbles like the butterfly keyboard design have definitely reduced that perception, though.

I have a Dell XPS 13 now, and while I like it, it's just not as pretty as a MBP. I recognize that that's a highly subjective judgment, but that's a thing that matters to me. The XPS's build quality is good, but not amazing; for example, I've had it for less than a year, and one of the rubber strips on the bottom of the laptop has been peeling off for the last few months.

Overall it just makes me sad that "hostile platform" is even a thing. At this point I've decided that the alternatives are good enough for me, and the better hardware support tips the scales in favor of them.


Good experiences with the ZenBook line here. Pop!_OS worked out of the box with everything but the fingerprint reader, including the Optimus GPU.

On my 2011 MacBook Pro I have run various Linux distributions since 2013, mostly Debian and now Solus. Both have been a joy. I have not experienced the same issues that you describe. All my hardware just works out of the box.

Same story here

The window animation delays on the Mac are killing me - over a month you lose hours of screen time just to animations. Also, macOS has ZERO use case as a cloud OS. Anything you learn is meaningless except the Unixy parts, where Linux rocks.

1) You can turn off window animations, and it seems strange that they would be a performance bottleneck.

2) macOS is not a cloud OS. It certainly can be a portal to the cloud quite easily, via terminal, iterm2, ssh/mosh, VNC etc. But you know that already.

You can also run VMs, KVM, docker et al on it as well.


1. It's better to avoid macOS totally. 2. You repeat what I said. Linux is better in every detail.

Software that is unique to MacOS:

- HoudahSpot (search files / documents)

- DEVONthink (document management system)

- Ulysses app (a markdown writing app; I know there are many, on many platforms)

- iA Writer (another markdown writer)

- Alfred App (automating)

- Keyboard Maestro (automating)

- AppleScript (automating)

- MailMate

- Things3 (task management)

- OmniOutliner

- OmniFocus

- OmniGraffle

- PDF Expert (reading / annotating pdfs)

- iTerm

- aText (text macros)

- Timing App (auto time tracking)

and many more.

Many of those apps have their equivalents in the Linux / Windows world, but their Mac counterparts are IMHO much better.

Finally, it's the overall experience. Everything is streamlined, everything works across iDevices, everything (mostly) works. I can't say it in a few sentences. Sitting in front of this machine, coding, writing, listening to music, that's a joy. I didn't have this feeling with a Windows machine (I was a Windows and Ubuntu user until 2015).


Agreed about the Windows feeling, though the reason I have a Windows laptop (which I use personally more than my MacBook) is that outside of development tools it does far more than Mac or Linux machines.

Have you noticed how all your arguments are fine yet only personal preferences? (Except the data loss.)

The UX, UI and 'feel' is what makes people use one thing over another. Unless you can replicate that, or can do better, nobody's getting swayed either way (and that's fine, do what you want).

There is no objective 'my pc is better than your pc' because you're not the same person as the one you'd be attempting to compare against.

Better hardware and better drivers are useless arguments if someone wants to go to a shop, take a device home, plop it down and do some work. None of the betterness of the other stuff is going to undo the work, or undo the experience.

It's like telling someone their 24-pack of toilet paper is stupid because your shipping container full of toilet paper is cheaper and you have more than they have. (especially when all they wanted to do was wipe their butt)


How is better hardware-to-price ratio and driver performance a "personal preference?" Do you not want those things? If you're saying that having a particular UI is important, that's another matter. I'm just explaining what it is that caused me to ditch the Mac ecosystem.

No, those things aren't the main driver for my choices.

Say you have two options, same price, different vendor. And the main difference (for the sake of the argument) is that one comes with a mobile i5 and the other one with a mobile i7. Technically the price-to-performance ratio is better for the i7 model. But the driver for me to buy any device at all is whether the device can do the task I need it to do. If the i5 does the task the same way the i7 does the task, what is the extra value that the i7 supposedly brings?

Often people will make a 'what if' argument here, because "what if you need to do something different" sounds like a nice argument for getting more than what you need. In reality, I don't often see anyone changing their tasks mid-lifecycle and then having a sudden requirements change. It hasn't happened for me at all.

The same parallel can be made with cargo capacity of a vehicle; technically every Lamborghini ever is a stupid choice because you can't move your furniture with it. Or every Fiat Panda is stupid because you can't drive at 200 Km/h speeds. Make it more absurd: every boat is stupid because it doesn't fly like a rocket.


Linux is objectively more configurable too (obviously).

This is probably my biggest complaint with macOS.

I'm still shocked by the inability to disable inertia scrolling on a usb mouse, as well as acceleration.

These things were removed on purpose.

If you use a macbook with usb devices, there are all sorts of oddities. It's annoying. I can use it, it's just not for a power user. For me, power user means "configure it", shape it to maximize your workflow speed.

MacOS is just not that. With linux there is a lot more flexibility.

I don't care too much about eye-candiness, I want my UI to be incredibly fast. MacOS has those damn animations when switching spaces that are annoying to say the least (I switch spaces very often).


I use Safari as my main browser now due to the password generation/sharing with my iPhone. It's a killer feature for me.

That's a commodity feature at this point.

Assumptions: 1) All my devices are Apple 2) I'm not planning on leaving Apple

For all the negative replies I wonder how many people have actually used Keychain. It's very good. I tried 1Password, Google's, and Firefox's, but none integrates as well among all my devices, and with zero effort in the OS, as Keychain does. Anyone leaving negative comments about Keychain, at least point out that you've given it the ol' try in the Apple ecosystem.

Or we have different assumptions and you have devices outside the Apple ecosystem or don't want to be tied to it, in which case the benefits I see in Keychain don't apply to you.

Also, yesterday's WWDC showed some cool new features around bad passwords (but I think other password managers already do this).


Firefox does this too FYI

I like Firefox but I don't use it on my iPhone. I use the Apple pw manager to log into websites on both phone/laptop a lot. The syncing to my iPhone is great.

As do Chrome and Edge.

Pray you never have to bulk export your passwords. It is extremely painful.

Almost every browser has this feature...

But "almost every browser" doesn't sync your saved passwords seamlessly between devices (mobile and desktop) with an end-to-end encrypted database, and automatically offer to autofill them, with the autofill itself protected by Touch ID/Face ID on the devices that support those.

For one, Bitwarden does all of that and more (I guess except the Face ID thing, but I don't really want that so I'd disable it anyway). It has web, mobile and desktop apps, and browser extensions. All available in the free plan. Oh, and it's open source and self-hostable.

https://bitwarden.com/

(Not affiliated with it, just a pretty happy (free) user!)


Neat, and glad to know about it! I've actually been surprised (and a little frustrated) that there haven't been any alternatives to iCloud Keychain for genuine seamless, e2e encrypted, password management.

Does it actually work the same as Mobile Safari's interface on iOS? Automatically presenting options both for creating and autofilling passwords on webpages? (If it can do that in iOS Firefox, I might consider it...I've been using desktop Firefox for a while because of some bizarre idiosyncratic issues I have with Safari on my laptop, that I can't reproduce anywhere else, but are extremely irritating where they do happen...)


Yes, it does. The integration with both Safari and Firefox on iOS is excellent. In Settings -> Passwords & Accounts -> Autofill Passwords you just change the backend from iCloud Keychain to Bitwarden, then it otherwise behaves substantially the same.

A big benefit of switching to Bitwarden is that you can also get it up and running on non-Apple machines. Kicking myself for not doing it earlier.

Just be aware that bulk exporting your passwords from iCloud Keychain is non-trivial.


Lastpass does this, but works with apps as well. Third party password managers are absolutely underrated, because most of them are truly cross-platform.

If you create a Netflix account on your desktop in Chrome, it can generate a password for you, or auto-grab whatever password you typed in, and will autofill that password when you open the Netflix app on your phone. (Based on a built-in association between "netflix.com" and the Netflix android app.)

Super handy feature when you get a new mobile device and need to get logged in to all your apps for the first time.


Chrome does that. If you set a passphrase for sync, then your data is e2e encrypted.

https://support.google.com/chrome/answer/165139


Use a password manager. It's safer than trusting Apple or Google who are after your personal data.

Apple never gets your saved login credentials. iCloud Keychain is end-to-end encrypted, and Apple hasn't been shy about advertising that fact.

> Apple never gets your saved login credentials.

Yeah, keep telling yourself that. They are a part of PRISM.


Also worth noting is that with recent iOS versions, third party password managers are recognized at the OS level, and can be used with the same seamlessness as iCloud Keychain.

Apple and Google aren't going to sell users' login credentials, and if law enforcement is requesting information, they will go directly to the provider who contains your data they're seeking.

Apps shipped with macOS are actually good IMO, or there are just not a lot of better alternatives. I never found a better alternative for iTunes or Mail (used Thunderbird and mutt for a while and they're either bloated or have bad UX). macOS designers seem to be good at cramming in functionality (I know it's not the Unix philosophy) while making it not look bloated at all.

Mail is a pretty awful client IMO, but still better than most others. Canary and Airmail, however, are absolutely amazing (I recently switched to Canary from Airmail) and there really isn't anything comparable on other platforms.

Mail.app is an appalling client. Having recently switched email providers, I tried copying several thousand emails from one mailbox to another. Mail would invariably crash after moving several hundred, and then could not be restarted without immediately crashing. I lost count of how many crashes I reported to Apple (aside: has anything ever come of these reports?). Mail is so bad, it did what I didn't think was possible - I switched to Outlook. And guess what? Outlook handled my email-moving task faster and without error.

Internal quality control for macOS and the built-in Apple apps has fallen off a cliff in recent times. Catalina has been disastrous in that regard. Having used Macs since OS X Leopard, I'm not jumping to Big Sur. I switched from Windows because it was a buggy, hot mess. I'm not hanging around and risking my work as macOS quality control keeps falling. Apple's developers have lost my trust.


I once had a similar need but Outlook failed me hard. Never tried with Mail.app, but I'm sure it wouldn't go well. In an ideal world I would just write all the software I use myself, because I've only seen maybe <10 actually good pieces of software in my life.

> Apps shipped with macOS are actually good imo, ...

As long as your standard is low, sure. Scanning with Image Capture results in needlessly large files and it has turned out to be quite a piece of crap. I moved my scanning pipeline back to Linux, and now I've got text recognition for free!


Some are good. But many have better alternatives. All mac apps leech your data to the cloud though. And that should be a huge no from all of us.

>and my downsides are... what exactly

Less cohesion, competing GUI libs/configuration systems/etc., still not 100% GPU support, the Wayland situation, almost non-existent multimedia (video, audio) options (not to mention support for peripherals like audio interfaces), and being at the arbitrary mercy of your favourite desktop environment's programmers, which is worse than Apple's (e.g. abrupt transitions and bizarro decisions in GNOME and KDE major versions).


> still not 100% GPU support

I'm a graphics engineer so I'm not sure I understand this argument. When I talk about weaker and worse-performing hardware options and drivers on Mac, I'm referring _mainly_ to GPUs.

> Non-existent multimedia (video, audio) options

This... is simply not true (?)

Also, regarding Gnome/KDE, I dislike both and don't use either. Something like i3 with zero-to-no configuration is already pretty great out of the box.


>When I talk about weaker and worse-performing hardware options and drivers on Mac, I'm referring _mainly_ to GPUs.

And when I talk about weaker support for GPUs on Linux, I mean the Wayland / compositor situation, getting 3D to work without planning what specific model of laptop to buy to use with what specific distro, and so on...

>This... is simply not true (?)

This is simply true. It can't run Cubase, Logic (OK, that's expected), Pro Tools, Studio One, Reason, FL Studio, Maschine, and tons of other things besides. Your best bet is some niche players like Bitwig and Tracktion, or crossing your fingers with Wine... And let's not get started on drivers for professional audio interfaces and peripherals...

Same for NLEs...


There are no good mail clients for Linux that I can find. All of them are great, except for one or two things that make them completely awful.

> (seriously, safari is the new internet explorer).

I'm not sure what this is supposed to mean. Safari is holding back the web because despite its monopoly on users it doesn't support new technologies? That's not true at all. Especially because it's got such a small market share, it should be up to them to deliver what they feel makes the most sense, and let the market decide the rest (though sadly the market seems to keep using Chrome, which is terrible).

Honest question though: as an almost exclusive Safari user for the past 15 years, what am I missing out on that I could have experienced if Safari had supported WebGL 2? I don't know that I've encountered any sites or use cases where Safari was holding me back in that regard, but I'm curious to know what was out there that I couldn't see before.


I rely on Time Machine to back up my Mac.

AFAIK, Windows and Linux still don't have such a tool that can back up/restore without pain.


Does your benefit come with battery life and portability? I couldn't care less about speed when I have a 64-vCPU box 10ms away.

Linux is great for development work and casual web browsing, but imo it doesn't feel polished enough for most people's personal stuff.

> iTunes, Mail, etc., none of the software shipping with MacOS is actually any good. I

It's better than the open source stuff on Linux. While that has gotten much better over the years, it's still not good enough for normal people like my parents or my sisters who are not techies. That's just the reality for now, at least. Whether or not you agree with it is beside the point.


I maintain an old Mac for corporate purposes, but yes, a Mac is just tedious. While things like printers "just work" in Linux, they really struggle in OS X, requiring "drivers" like it's 1999.

The other benefits of Mac hardware (MagSafe for example) have all been removed leaving not-a-lot.

YMMV.


Sometimes it feels like I live in another dimension. I've worked with dozens of developers using Macs who never make complaints beyond little niggles, then I come on here and you'd think the entire ecosystem was a dumpster fire. It's enough to make me think I don't work on the same system they're describing for 60+ hours a week.

For me, the React Native experience has been instructive. Android is quick and easy to develop, debug, and deploy. The iOS side includes multiple hoops, constant frustration stemming from XCode, long app release times, and minutes-long builds for production-size apps.

Our team, without any policy dictating daily development habits, has coalesced around Linux/Android development with iOS testing before release.


As a consumer, I've never used a React Native app that I liked using. It's always janky and compromises UX in ways that made me simply use the app less or had the effect of teaching me to "get in and out of the app as quickly as possible".

> As a consumer, I've never used a React Native app that I liked using.

If it is a good react native app you won't know it is RN.

If you notice it, it is by definition poorly done. There are some things (e.g. specific animations, UI elements) that RN just isn't great at, the best way to avoid making a janky app is to avoid even trying to do those things.


Non-native menus and poor keyboard focus handling and shortcuts are the giveaway, and extreme memory usage!

That’s your bias. Most consumers have mo clue what react native is

As with everything, it depends on the implementation. There are some things I'd like to clean up about our app, but overall feedback has been very positive.

> Our team, without any policy dictating daily development habits, has coalesced around Linux/Android development with iOS testing before release.

Interestingly our team has gone the opposite way (again, no official policy). The deciding factor in our case has been the relative performance of Xcode/iOS Simulator vs Android Studio + Android Emulator. Android Studio on its own slows my computer down more than Xcode compiling AND the iOS Simulator running at the same time. The Android emulator is all but unusable on my MacBook.

I agree there's a lot of painful bits around signing/release on iOS though.


I recently ran into some trouble with some tools installed in /usr/bin a while ago, and wanted to try making some changes.

Even with root, you can't touch /usr/bin. It took me way too long to try to figure out what was going on, and all the fixes were hacks.


I may be missing something here, but why would you need to change the stock installed tools in /usr/bin? Seems like an easy way to screw up your OS installation. And it's not like a VM where you can rollback to a snapshot or relaunch.

If there really was an issue there then you either need to file a bug and have them mainline a fix, or yes hack/shim a fix on top for your needs. Perhaps leveraging PATH precedence.
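
(A minimal sketch of the PATH-precedence idea, with "mytool" as a made-up stand-in for whatever you wanted to change in /usr/bin:)

    # Put your own version somewhere you control ("mytool" is hypothetical)...
    sudo cp mytool /usr/local/bin/mytool

    # ...and make sure that directory wins over /usr/bin (e.g. in ~/.zshrc)
    export PATH="/usr/local/bin:$PATH"

    # The shell should now resolve to your copy
    which mytool    # -> /usr/local/bin/mytool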


Embedded in your response is the general attitude one hears when concerned about not being able to do "a thing" in the Apple ecosystem: why would you want to do that?

Questioning the use case and insisting that one doesn't actually _want_ to do a thing instead of allowing the user to control their own system is the quintessential Apple experience.


You can still do this thing, you just do it a different way that doesn't fundamentally risk breaking the OS. I understand that there may be some non-zero sized group of people who absolutely want to screw with protected OS files, and even for this group of people you can go and disable SIP and mess with the OS all you want (one of the Macs I have is a hackintosh, which requires some decent modification).

However most people, including me, and I'd venture most engineers too, would rather have a hardened system.


> Embedded in your response is the general attitude one hears when concerned about not being able to do "a thing" in the Apple ecosystem: why would you want to do that?

Funny, that's a quote I hear a lot in Linux Desktop ecosystems as well. I think it is just the nature of people so accustomed to a certain way of thinking that any other use case that comes along is automatically considered to be doing it wrong.


Linux doesn't have a single desktop ecosystem. Apple very much has One Apple Way.

If Linux is going to get attacked for not having a single GUI, it would be nice if it weren't also attacked for having a repressive GUI monoculture. /s


> I may be missing something here, but why would you need to change the stock installed tools in /usr/bin?

Because it's my computer that I paid for with my hard earned cash, and I want to.

How far we have fallen.


Then you should probably disable SIP and go do whatever the hell you hope to accomplish with that.

I can't agree with this more! It's a bleak future. Back to Linux I guess.

Apple's open-source devtools are old as dinosaurs, so that might be a common case.

But more importantly, it's a question of whose laptop it is. I continue to think that the tools I build go in /usr/bin, because that's the way I like it. Apple is telling me I'm liking it wrong.

As for filing a bug with Apple - good one. Every single Apple dev considers rdar:// a black hole, and the chance of getting a fix from Apple because of bugs filed (vs. Apple wanting to fix it anyway) is slim to none.

Overall, Macs are more and more machines that want to prevent shooting yourself in the foot, at the price of less flexibility and access. This is a good choice for some, it's not a good choice for me. (And many other people who like hacking their machines)


>Apple's open-source devtools are old as dinosaurs, so that might be a common case.

Yeah, but there's a certain expectation that the tool that you have installed in /usr/bin is a certain version. There's a reason why tools like Homebrew generally do not overwrite built-in tools.

If you just replaced /usr/bin/python with Python 3, you'd probably break all kinds of things.


The point is, it is my machine to break. Apple is more and more deciding that I don't get to do that. It's a choice that benefits a large class of customers, but it's detrimental to people like me.

macOS used to be "Unix, but with a great GUI". It is turning into "iOS, but with a few command line tools".


I had other reasons that I needed my PATH to be in the order it was.

> Even with root, you can't touch /usr/bin

Yes, you can. If you disagree with Apple's System Integrity Protection, you can take two whole minutes of your life (if even up to that, considering how quickly OSes boot these days) to turn it off permanently and mount the entire drive as read-writeable. For some strange reason, though, I tend to encounter far more people complaining about it than actually taking the steps needed to fix the "problem".
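
(For completeness, the "two minutes" look roughly like this. This is from memory, and more recent macOS releases handle the read-write remount differently, so check the docs for your version:)

    # 1. Boot into Recovery (hold Cmd-R at startup), open Terminal, then:
    csrutil disable

    # 2. Reboot normally; on Catalina, also remount the system volume read-write:
    sudo mount -uw /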


Perhaps what you need is GNU coreutils - https://www.gnu.org/software/coreutils/coreutils.html ?

Just install it with MacPorts and you can use all the tools by prefixing them with a g - gls for ls, gdd for dd, etc. Some of these tools are newer versions than on macOS and hence improved (as GPL3 prohibits Apple from including the newer versions).
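
(For example, assuming MacPorts is already set up; the gnubin path is from memory, so verify it with "port contents coreutils":)

    sudo port install coreutils
    gls -la          # GNU ls
    gdd --version    # GNU dd

    # Unprefixed names are also installed under a separate directory
    # (/opt/local/libexec/gnubin, if I remember right) that you can put
    # ahead of /usr/bin in PATH if you want them as the defaults.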


Like I said, I could hack around it eventually.

It was annoying that I couldn't do the basic things I expect to be able to do on a machine as root


which is what /usr/local/bin is for, right?

I can't speak for why the OP feels this way, but I think one of the main gripes I've seen from "power users" of the latest MBPs are the touchbar and Apple's continued insistence on it and the larger trackpad that some find obtrusive (these features are generally embraced by end-users). Then of course there's the lack of USB ports that some take issue with and the fairly disappointing built-in webcam on the latest models. These are not problems for everyone and most have workarounds, but some "power users" are turned off by this. The abandonment of the butterfly keyboard is definitely a step in the right direction.

I would also consider myself to be a power user. However, one huge annoying problem I've encountered is sharing Homebrew between multiple users. It seems like it's not possible for Homebrew and all of its libraries/binaries to be owned/usable by more than one user at a time.

Apple should roll their own package management tool or officially support homebrew.
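
(The least-bad workaround I've seen is handing the prefix to a shared group. A sketch only, assuming an Intel Mac with Homebrew under /usr/local and that every sharing user is an admin; it still has sharp edges, e.g. per-user caches stay per-user:)

    # Let members of the "admin" group write to the Homebrew directories
    sudo chgrp -R admin /usr/local/Homebrew /usr/local/Cellar /usr/local/Caskroom
    sudo chmod -R g+w   /usr/local/Homebrew /usr/local/Cellar /usr/local/Caskroom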


The Homebrew developers (of the Homebrew framework itself, not all the random app maintainers) have been "doing it wrong" since day one. It is utterly and unforgivably wrong to install system-wide binaries owned by a normal user outside of the user's own home dir. That's what they were doing for years, until fairly recently when Apple had to simply change the OS to break their install, forcing brew to change it, finally, against their will.

MacPorts did it right, but "brew ..." is just a cooler, catchier name.

It literally comes down to that. The more fun name.

When someone is new and confused and overwhelmed, a slight difference like that, or the design of the website, or the charisma of the people talking on forum posts, is the tipping factor in which of the 11 possible things they try.

And as long as the first thing they tried worked, they keep doing it and quickly gain a rewarding feeling of accomplishment and confidence doing that thing.

At that point this is their home and their comfort. They aren't bothering with anything else when it doesn't seem to be any different. They conclude they chose wisely the first time (which is a good feeling that anyone can always simply decide to reward themselves with, i.e. confirmation bias).

So, no one should have ever used homebrew. Homebrew "won" anyway, but not by being the better-architected more technically correct system with wiser engineers.

I forgive all the users for not realizing that the directions they followed are terrible and broken. There is no excuse for the developers who wrote that system.

So, your problem with homebrew, I say, the problem is homebrew not Apple.


> It litterally comes down to that. The more fun name.

I can't speak for anyone else, but I switched from MacPorts to Homebrew, after having previously switched from Fink to MacPorts many years ago despite not only being told but having actually experienced that Homebrew was slower and more fragile. And the reason was not "the more fun name." The reason was that both MacPorts and Fink were incredibly slow at expanding and updating their ports tree. Stuff that I knew from my Linux and FreeBSD days, or cool stuff that I read about on weblogs or in places like Slashdot and HN, was nearly always right there in a current or new-current version in Homebrew. That was rarely true of MacPorts: often it wasn't there at all, and if it was, it was often several minor or even a full major version behind.

This may have changed since then -- I hope it has, since I suspect it's been over a decade at this point -- but IIRC, the last straw was trying to install the terminal version of a program (either Emacs or Vim, I think) that also had a GUI version available and, after going to make coffee and coming back to the computer, discovering MacPorts (a) didn't care that I had explicitly asked for the terminal version, it was going to install the GUI version for me anyway, and (b) because of MacPorts' aggressive "never depend on the system version of anything no way no how", it was building a new version of the entire bleeping XFree86 from source for me. Yes, I understand why MacPorts takes that "no system dependencies" approach; yes, I get that was probably a badly-specified port file. But I ripped it out and never looked back.

Well, I take that back. I think I looked back once, about four years ago, to see if everything that I had installed under Homebrew on a machine could be installed with MacPorts, because maybe it was time to give it another chance. The answer was no, everything could not be installed. So the answer was no, it was not time to go back.

I don't care that Homebrew has a fun name. I care that it has the stuff that I want kept reasonably current with upstream versions. Homebrew has, as far as I can tell, gotten a lot better over the years at not being fragile. It's still no speed demon, but the truth is that I don't actually run it that often. If a more native macOS package for a given piece of software is available, I'll install that in preference.


Have you tried Windows with the Windows Subsystem for Linux (WSL) before? I've noticed significantly better performance from Windows 10 on a Lenovo T580 (from 2018) than from my MacBook Pro (from 2019). I love the Win 10 + WSL combo 10x more than my Mac, so much that I have now moved all development and daily work over to my Windows box. I only touch the Mac as a test box before running things in production, when it used to be the other way around.

>> I consider myself a power user, and precisely because MacOS feels like the best of Unix combined with customer focus [...] I simply can't relate to the statement that Mac is a "platform that contends with me every step of the way" and I'm curious to know more specifically why you feel this way?"

I just plugged an extra SSD into my old ThinkPad yesterday. My boot volume is mSATA. The hard drive bay was empty, and I added the drive with one screw. Now I have a VM storage pool. It now has more storage than my gf's Mac, and it cost 1/6th as much.

I tried something similar with her computer yesterday. Plugged in a USB SATA SSD. ext4 isn't supported. Alt-Tab is broken. Using the touchpad for everything is mandatory because hotkeys are counter-revolutionary, I guess? The GNU userland is there, but like half of it is broken or fake, like fstab/mtab aren't real? The console clipboard doesn't seem to work. All the apps are already running instances without a window or something? Is this supposed to be like a phone?

I just want it to fullscreen tmux and leave me alone, but I keep having to search the internet for the cutesy name of the GUI thing that is the only way to do <trivial task>. I could format the disk and put the data back afterwards, but I'm now afraid file I/O in Python will be even more fun than I'm already having, so I go find another computer.

I keep trying to learn how to use a Mac, but it's always so much harder than just using any other computer that's around that I can never seem to make the time. I find the UI willfully contrarian, and more alien than anything I have ever used. I'm not trying to be an OS fanboy here; I know lots of smart devs who say they like it. I want to "get it" but I just don't "get it". Discoverability seems poor, especially for a supposedly beginner-friendly tool, and the UI and locked-down software ecosystem seem so limiting for devs. Like, who is this for?

My gf told me to stop trying to think and just click on things... She's a modeler/dev and she hates her Mac, but her employer won't let her use Linux anymore. None of this feels like a dev or "power user" experience. It feels like using Windows XP because the boss made me.

Is this like that thing Neal Stephenson said about the difference between "easy to learn" and "easy to use"? Like if I learn to use it properly I will start having "ah ha!" moments and become more productive?

EDIT: I really don't mean to be flippant! OP asked why people find it limiting. They seemed like they know what they're doing, and might be a dev, and lots of devs use Macs.

OP called it "the best of Unix combined with customer focus", which sounds great to me! I just don't understand what went wrong...


I have the exact same experience. I've developed on Windows, and for the past 10 years on Linux. The experience with the latter has changed so much in terms of usability that I struggle to understand why the alternatives are at all desirable for developers. Out of necessity I had to do iOS development on OSX, and out of the gate that already leaves a bad impression. I can cross-compile and target anything from anywhere and run in emulators or virtual machines and whatnot; the hardware can do it, the software allows me to use it, it's all good. Except the Mac ecosystem. So, purchase of a Mac it is, install of Xcode and the whole shebang. Very little is enjoyable, and less is particularly intuitive. I love my grandmother, and for non power-users who mostly want to use a browser, a GNOME-based Ubuntu seems a better option than OSX, and definitely better than Windows.

As for how the experience was for me, in terms of being a developer and power user, everything Mac has from Unix seemed a bit halfway. It certainly surprised me repeatedly to find some core utilities at older versions and not supporting flags here and there. But this was fine. Homebrew felt sluggish. But this too was fine. Then comes iOS development, and oh lord, what a shit show. That was not fun: the release process, the signing, and all that.

I've reached the point in my professional life where I want things to work. I don't wish to struggle with technical limitations caused by politics and marketing decisions. Development on OSX feels that way. You can't virtualize the OS, and can't emulate an iPhone. You have to go through way too many hoops to do what should be simple.

So I honestly, and with no inflammatory motivation, don't see why people like to work on OSX. I understand you might wish to if you are limited by software and tools that restrict your choice, making it either that or Windows. And I dare say Windows is, in total, even worse.

For usability and ease of use, as an OS, for use with free software and other software that exists for the platform, it's, in my honest opinion and in an effort not to be too biased or fanboy-y (again, I wouldn't push my fanboyism on my grandmother), simply easier. For a power user, again not limited by language or tech, there is no competition. None. I'd choose to work for a different company if it meant I didn't have to develop on OSX or Windows. Life is too short.


"Power user" may be as much as changing Windows Theme. Not much is allowed so bar is not high. Through the page again and again "users don't care".

Linux changes expectations. Everything is possible; there should be a package for that; deconstruct, throw away, replace; and most of the time someone has already done it.


>> I don't wish to struggle with technical limitations caused by politics and marketing decisions

So much this. When I depend on proprietary stuff, the care and feeding of licenses and serial numbers and all that stuff never seems to end, and it's shackled to one computer or host name or MAC address or something like that, which changes faster than calvinball.

It's not even about money. Who has time for all that busywork?

If the Mac platform were freely redistributable and Linux were proprietary, I would switch to Mac in a heartbeat even if I hated using it, just to protect my time from doing all that IT stuff :p


This - precisely. I live on a Mac all day every day. Does everything I want, exactly how I want it, and makes me so much more productive than if I had to fight with Linux all day long.

I am a developer and power user. I am incredibly happy with every update from Apple. Unix with good design is exactly what I want from my OS.

Sure I can't disassemble my laptop and change the hardware. I have no desire at all to do that. I literally can think of no reason I personally would want to do this, although I imagine there are a tiny fraction of users who would and you may be among them.


Easy. A new application you need requires more RAM. Your battery is not holding up very well and you want to replace it. Your SSD will fail soon and you want to replace it. I had every single one of those issues on a laptop, and fixing them cost me from $25 to $100, about a fifth of what the Apple Store or OEM was quoting me. Instead of being without my laptop for a week or two, I got all of them done in an hour and a half tops.

Everyone has a device fail on them. Then you have the choice of replacing it or repairing it; repair is almost always the better option if you're at all technically competent.

I can't imagine my SSD breaking down or my RAM failing and having to change the entire mainboard when I could just take 45 minutes and fix it. It's literally absurd to me.


Your scenario definitely exists, but it also doesn't hit users like the one you're responding to as often as you might expect.

SSDs don't fail the way they used to, RAM requirements don't change as much as they used to, and while batteries do die, these days I've seen more people switch machines before any of that happens.

The last time I wanted or needed to replace memory or storage in a laptop must have been 7 years ago, and I don't see anyone else doing that either. Even desktop upgrades are somewhat dead at this point, except for a GPU upgrade if someone is a gamer, or storage upgrade if people still use direct attached storage and didn't get enough when they initially got the machine.

Just because a scenario that hits a small percentage of DIY users is getting hard doesn't mean it's bad for everyone else as well.


If SSDs don't fail, more RAM isn't needed, and so on, then why do people feel the need to buy a new laptop every three years?

I think the percentage of users that would buy a new $1,500+ machine instead of making a small $100 upgrade is much lower than one expects when living in a bit of an echo chamber. There are a lot of people that get burned by outrageous repair prices, or that buy new computers when they could simply upgrade a part.

Ask yourself: if you've never felt the need to upgrade a laptop or to replace a broken part, then why really would you upgrade?


> why do people feel the need to buy a new laptop every three years?

Beats me. You will often see comments in these threads about people (like me) using nearly decade-old Macbook Pros to this day.

I stopped replacing laptops every 3 (or less) years after I stopped buying budget non-Apple laptops.


Well, a lot of people aren't upgrading old MacBooks because the new ones have been such shite.

I'm on Linux, and a laptop I can take apart and replace bits of, so I can see myself using this one for a few years more.

My Dell XPS 15 is suffering from Windows Rot after only two years. When I get a free afternoon I'm going to have to back it all up and clean it out, hopefully avoid reinstalling from scratch. For all the negativity I've had towards OS X, at least it's better than that.


Some people probably have some write-off system, or perhaps resell them while the devices still have some value, so they always are somewhere 'in the middle'.

I personally tend to keep my most recent device and the new device at the same time for about 2 years because of vendor lag (happens a lot with those classical software vendors that take months between software releases) and because I need the ability to compare between versions of both hardware and software. The newest device gets promoted to daily driver (usually many benefits there, as they often are lighter yet more powerful). In general it means 2.25-ish devices per 10 years.

Other hardware, like SBCs tend to rotate out slower, but generic x86 platforms rotate out faster because if a critical component fails the labour for finding parts and replacing them is too much vs. buying up to date replacements. Luckily, due to lower usage of those machines they last longer, so technically the Apple hardware made the non-Apple hardware have a longer lifecycle in my case.

There is some irony in there as I do provide board-level repairs and the machines I work on for other people are ones I'd never personally invest in.


Because of the overall degradation. Sometimes I want 10% more of 'everything', plus a new feature, plus less weight to carry around. That's practically been the only driver of my personal upgrades over the last 15 years.

And by degradation I don't mean the existing device degrades per se, but the degradation of productivity on the current device vs. new device.

At the same time, the way work is done has changed a lot for me and the people around me: heavy workloads are almost never done on a laptop anymore.


> but the degradation of productivity on the current device vs. new device.

What does this mean? Do you mean your current device starts working poorly (but that would mean "existing device degrades per se", which you ruled out) or that the existence of a new device automatically makes you perceive your device as being of "degraded productivity"?


For example if I run something that can take advantage of AVX512 and my current CPU doesn't have that but a new CPU does. Same goes for TB2 vs. TB3, very useful when you want to connect an external GPU. It does work on Thunderbolt 2 but the extra bandwidth of Thunderbolt 3 is a nice improvement.

Say you change your workload model there might be ~20% improvements between bare metal, virtual machines and containers. If you simulate a part of infrastructure using containers you may not need more RAM, but more CPU would be nice. But when you then want to do a lot of recording/capturing and process that, RAM gets more important. Just upgrading the RAM wouldn't help much because without a CPU to generate the data you might as well offload the whole thing.


> For example if I run something that can take advantage of AVX512 and my current CPU doesn't have that but a new CPU does. Same goes for TB2 vs. TB3, very useful when you want to connect an external GPU. It does work on Thunderbolt 2 but the extra bandwidth of Thunderbolt 3 is a nice improvement.

This is definitely an echo-chamber/bubble point of view. The vast majority of users out there don't even know or care what AVX is, don't use external GPUs, and don't know or care about the difference between Thunderbolt 2 and 3.

If you personally need these things and want to buy a new machine every few years, then that's great; you should do that. But there are a ton of people who would benefit from an easily-repairable, easily-upgradable (RAM, storage) machine and who end up dropping $1500 every few years instead of the couple hundred they could spend on a reasonable upgrade.


Seems you are responding to the wrong thread here. User the_af was asking me why I replace a machine and I answered with some reasons specific to me. This was a deeper dive into the point that some people don't need to upgrade at all because they don't do anything different between day 1 of their usage and day 1780. And they don't need to because the laptops of the last decade don't fall apart as much as they used to, and a Mac specifically tends to work well during its entire lifecycle. This is also why there aren't as many people interested in modifying their computers mid-lifecycle.

While I bet that there are a lot of people that do want to modify their systems, they are such a minority that it's not very logical for a large multinational to invest in that to the detriment of other goals. It might simply mean that you are not the target audience for their product(s).

Some other manufacturers/brands have the same, while others do a mixed portfolio to cater to smaller groups as well. We also have large manufacturers that cater to the classical enterprises which still run on the old idea that you need a fleet of identical machines and then swap out parts all day long, so machines that have facilities for that exist. Most notably Lenovo, HP and Dell do that.


Most normal people I know have 10+ year old laptops. People that buy laptops every 3 years are pros or enthusiasts that want the best performance.

Apple devices do hold their value fairly well (certainly much better than non-Apple hardware), so the cost is offset somewhat.

I whinge about the non-upgradability of our devices as much as the next guy, but it's not like people are throwing their 3 year old macbooks in the trash can.


How many upgrade their personal laptops every 3 years?

Work one, sure. But I haven’t upgraded my Mac in years.


A work one gets at least 4 years of life; it's not good for value beyond that due to write-offs etc., which is somewhat strange when you think about it. After that, they are sold to whoever wants them (we wipe them, clean them and unlock them).

Most people get close to 8 years before they actually want a new one when their work doesn't change much -- if your requirements don't change and your tool fits, no reason to change.


> I've seen more people switch machines before any of that happens.

My opinion is that people switch machines more often precisely because they're not upgradable/repairable. I think if most people could go to their local computer repair shop to get RAM/storage/etc. replaced or upgraded, we'd have a lot less electronics waste, and consumer computers would last a lot longer.

At this point, software CPU and GPU needs aren't increasing all that much year by year, unless you're a gamer or do HPC. For the rest of everyone -- the majority -- a CPU and GPU made 5-10 years ago is still just fine for what they want to do. These old machines just need more RAM and sometimes more storage.

I get that it's hard to fit sockets for replaceable modules when everyone wants a super thin laptop, but overall this is a source of so much economic and physical waste.


There are plenty of people switching machines that can be completely taken apart and replaced or upgraded component by component. Some other post around here mentioned eBay: plenty of completely modular HP and Dell laptops (and not even in bulk) for sale.

Most computer repair shops around here have disappeared because it's no longer the way people buy and use their computers. At least not enough people to keep those shops running.

Sometimes people upgrade because they feel like it. Doesn't always have to do with the numbers.


I agree in that, these days, I don't see myself wanting to upgrade a laptop's RAM before I'd just replace it with a newer machine. While it would be a nice bonus to have the option, realistically I imagine the 8 GB of memory that comes standard with the baseline MacBook Pro model would be enough for anything I'd personally want to do with a laptop in its lifespan, and I would be willing to sacrifice memory upgradability for a slimmer design, for example.

However, I think Apple (and other laptop manufacturers) cross the line when they ship soldered-in batteries with their laptops. Batteries are consumable components that are guaranteed to degrade over time. Unlike RAM, which, assuming Apple sources it from quality vendors, you can expect to last for many years, laptop batteries will definitely begin to degrade within a few years, and will, over enough time, render the laptop unusable. Apple is producing $1500+ laptops with a non-replaceable component that will, without a doubt, eventually break down. Your options at that point are to buy a new laptop, or to send it to Apple so they can charge an arm and a leg to replace the entire top case instead of just the battery (as that'd be impossible). You should buy a new laptop when you want to, not because you "might as well" without the option of battery replacement, and with Apple's "repair" option costing a quarter of the price of a new one.

Though to a lesser extent, it also irks me that the SSDs are soldered in, not necessarily because I'd want the ability to upgrade them post-purchase, but mainly because your valuable data lives there, and it will become trapped on a dead board if something goes wrong with an unrelated component. Even if failures like this are rare, the threat of data loss while using a MacBook would still concern me more than while using a laptop with a replaceable SSD. I believe Apple has a data recovery procedure, but you'd still have to send your laptop to them and cross your fingers in the event it dies, whereas if the SSD wasn't soldered in, you'd have the option of manually recovering your data, or taking it to a repair shop where they could do it for you.

I would love to see a MacBook Pro with, at the very least, a replaceable battery (but especially with a replaceable SSD as well).


Why are there so many Apple devices with 8GB RAM for sale on eBay then? The RAM prices for new Apple devices are 3x the normal rate, so people get disappointed with all the Electron apps eating RAM and sell them!

I'm not sure that is an indicator of anything at all. You could speculate that people sell their laptops because they want more RAM and that is the one and only reason this is happening. But then that's like what, 100 MacBooks? 1000 maybe? Hardly large in numbers compared to any other laptop from any other brand on eBay.

I know that a lot of people are very emotional about their RAM and SSDs and it can be a PITA, but it doesn't affect as many people as you'd think. At least that's what the available data shows.


This is very true, thanks.

All that, and also -- Apple sets prices about once a year. Meanwhile, the price of CPUs, RAM and graphics cards drops. For any price level that Apple establishes, I can afford a Hackintosh desktop of much greater power or I can save a lot of money and get equivalent power.

Though, to be honest, I don't bother unless there's a particular application which isn't available on Linux.


Hackintosh is a non-starter if you are using the computer for commercial use.

Turns out there are some very small niches where it makes sense. Basically, if you need to maintain compatibility with a larger organization but your profit is independent from theirs, and you have necessary technical resources at hand.

and you don't mind violating the licensing agreement.

Photoshop

Runs on WINE or in a Windows virtual machine.

The latter works fine for me.


Premiere

Final Cut Pro


That’s it? Great!

Sounds like non-iOS software developers that are stuck on macOS can just hop right off without any problem whatsoever...


Did Apple start using soldered on storage or RAM for (non-air) Macbooks?

I'm pretty sure the MBPs have had soldered-on components for quite a few years now.

Is it you, Elon? Did you return from a few years' trip to Mars? ;)

Lol, I have never heard this. For me, MBP = SO-DIMMs and an easily replaceable drive, while the Air is soldered! What year did it change? I swapped my HDD for an SSD and upgraded the RAM in my current MBP (which is a Sandy Bridge...)

> I literally can think of no reason I personally would want to do this

Being able to swap out RAM and storage and battery from factory-provided can extend the lifetime (and even increase the performance capacity) of a given machine at nothing more than the cost of the new parts. And Apple used to make it pretty easy to do this, and enough people did that there was a subeconomy of vendors selling for this purpose, although of course there were full-service options for people who literally couldn't imagine doing it themselves.

This also meant that downtime wasn't controlled by Apple Store service availability (literally had a drive fail on me once, had my bootable backup, swapped it into the machine, good to go).

And on a number of occasions I've done more complex replacements, a DVD drive here, a keyboard there. I can see why most people wouldn't want to do those, they're a giant pain, but having the option can be empowering.

What's actually hard to think of, if one is thinking, is what Apple has gained in return for this. I can kinda squint and see that irregular (and therefore, perhaps, less efficiently swappable) battery shapes have some credible advantages, but the rest of the marginal ounce-gains and dimensional-golf scores belong to a category of diminishing returns.


Being able to swap out RAM and storage and battery from factory-provided can extend the lifetime (and even increase the performance capacity) of a given machine at nothing more than the cost of the new parts.

It isn't just the cost of new parts. It's also the cost of those parts being larger in order to accommodate user-accessibility and replacement. When everything is surface-mounted on the PCB it can be made a lot smaller. This allows Apple to shrink the laptop as well as allocate more space for the battery.

No one is asking for phones where we can swap out the RAM and storage. Why do we want this from laptops? I'm typing this on a 2020 Air which I configured with 16GB of RAM and a 1TB SSD. I really don't anticipate a need to upgrade either of these things before the machine reaches end of life anyway.


> Why do we want this from laptops?

Why shouldn't we? I would gladly sacrifice a few mm of thickness for a machine where I could upgrade the RAM and storage (and replace the battery) every few years. Not only would that save me money, but it's much more sustainable from a manufacturing and waste perspective.

I just finally threw out my old (sadly broken) 12" G4 PowerBook, and marveled that it had a removable battery (you don't have to disassemble it; you just flip a latch on the outside, and the battery slides out). And I remembered that at some point I'd done a trivially easy disassembly to upgrade the HDD. I got that laptop secondhand, when it was already a couple years old, and it lasted me a good five years, and probably would have lasted longer had my then-girlfriend not dropped it on concrete, which somehow fried the drive controller.


My Air is 2.8lbs, far thinner and lighter than a 12” PowerBook G4 (which I used to own as well) with way better battery life. I don’t want to go back to an old brick of a laptop like that. I think most people don’t.

You’re part of a very small niche. The tinkerer who doesn’t want to jump to Linux. This is a tiny number of people, sadly, just as the number of people who want to tinker with their cars is very small. Most people just want something that works and is very convenient. They prefer to leave the service to service people.


> My Air is 2.8lbs, far thinner and lighter than a 12” PowerBook G4 (which I used to own as well) with way better battery life. I don’t want to go back to an old brick of a laptop like that. I think most people don’t.

Not saying you'd have to. Battery technology has improved since then; we can make lighter batteries that last longer. Hard drives have been replaced by small PCBs with a few chips on them; again, much lighter. Further miniaturization of the internals means a smaller chassis which means less material; again, much lighter.

It'd be perfectly possible to build a laptop with similar dimensions and weight to the current crop of laptops, but allow for easier battery replacement, and even RAM/storage replacement. I'm not saying it'd be as simple to swap as it was in 2005, but it'd at least be easier/possible and not involve special tools.

Apple seems to implicitly claim that their anti-repair/anti-upgrade stance is about "delighting customers" with smaller, lighter hardware, but I suspect it's mostly driven by their desire to lock down their hardware against tampering, and to keep people on the upgrade/purchase treadmill every few years.

> You’re part of a very small niche. The tinkerer who doesn’t want to jump to Linux.

I actually do run Linux (previously on Mac hardware, but I finally gave up on it last year), so that's not the niche I'm a part of.

But that's not really the point; I'm not speaking for myself, I'm speaking for average users. I see a lot of posts here claiming that average users "don't think about" this or "don't care about" that, and I posit that the average user doesn't care about an extra few millimeters of thickness or an extra few tenths of a pound of weight. Especially if it doubles or triples the useful life of the device through a RAM/storage upgrade and easy/cheap battery replacement every few years.

> They prefer to leave the service to service people.

That's fine too, but Apple is actively against allowing a robust, price-competitive field of independent repair shops. And even if they weren't, soldering in the RAM chips and NVMe drives means the only thing those repair shops could do would be to swap out batteries and main boards, rather than do cheap, targeted upgrades.


The 12” PowerBook G4 was 4.6 lb.

Yes, modern Apple laptops gave up the “brick battery that slips out with a latch.” But in its place we have terraced internal batteries that can fill any volume.


Huh? Most sold phones probably have a microSD slot.

It used to be the case I would swap MacBook parts out more when laptops were simply clunkier maybe 10 years ago (added a SATA SSD back in 2010). But lately with 16 GB RAM being fine for most developers even, NVMe SSDs being standard, Macs still being lame for GPGPU tasks besides mining cryptocurrency, and a lot of compute intensive tasks simply put into the cloud ... I can’t really say I need to do many hardware updates on a Mac besides perhaps the battery anymore. What really helps me most now is battery life and better screens to reduce eye fatigue, and that’s basically what Apple has done fairly well compared to the PC laptop market.

I’m kind of curious to see how usage goes for AWS Graviton 2 instances in the future as Apple laptops transition and there’s less friction to deploy along ARM based tool chains and artifacts.


The last time I had a laptop with replaceable RAM modules, there weren't any bigger modules on the market, and the CPU memory interface had a maximum capacity equal to the installed amount. I'm not sure expandable RAM in a laptop is all that useful, unless you intend on getting the lowest-spec version and upgrading it later.

> Being able to swap out RAM and storage and battery from factory-provided can extend the lifetime (and even increase the performance capacity) of a given machine at nothing more than the cost of the new parts.

While that is true, how many users of any kind of device actually consider that an option these days?


Changing the hardware or software yourself is not the point. The point is the monopoly of the manufacturer to repair and modify with huge negative consequences. The lack of right to modify the software leads to unjust power over users [0,1], while the lack of ability to repair hardware leads to unreasonable repair prices, monopoly and environmental damage [2].

[0] https://www.defectivebydesign.org/

[1] https://www.gnu.org/philosophy/free-software-even-more-impor...

[2] https://www.ifixit.com/Right-to-Repair/Intro


No, that isn't the point either. The point is happy users, and there will always be some unhappy users because they wanted to do something that isn't supported or actively barred.

Throwing in words like 'right' and 'unjust' helps in movements and call to action etc, but in this context it's about someone turning on their device, doing some work, and turning it off again.


Since I can't reply to the replies, I'll reply to myself:

All of those points are fine, but they are points separate from a user using a device.

If you want it for environmental reasons, say that. If you want it for freedom, say that. Both are valid reasons but don't have anything to do with someone using their machine, unless that 'usage' means doing stuff with the insides. And before someone jumps in to say that that is the main activity: great, but most people buy those machines to do work that doesn't entail opening them up.


I am not allowed to use a browser I want on iOS. Do you count this as “using their machine”? The reason is the lack of freedom to do what I want with the software.

Not sure what that has to do with hardware, but I'll bite: users don't care. They don't know what a browser is, what a client is, or a URL or URI or address bar. They want a device that, when they turn it on and type in 'tictoc dance dog', shows them something that resembles what they had in mind. Doesn't matter if it runs WebKit or Gecko, or if someone thought it'd be cool to emulate a Trident-based engine on iOS on ARM in a custom browser.

Freedom 'to do what you want' has not had much to do with what works in the world of selling to the masses when it comes to user experience (and yes that is ironic).

Regarding 'using' and 'their' and 'machine': I meant an activity based scenario which is what most people see themselves doing. They mostly have no concept of ownership, actions or device semantics. The reason is simple: it's not needed to be a user and it's not needed to get the same results as other people in your group have. And for a lot of people all that matters is doing the same as the rest of the local group.

Edit number 3: you can apparently select your default protocol handler in iOS 14. I bet there is some double-digit percentage of Google users that want that for Chrome, and a few sub-1% users that want Firefox. But that's still not freedom, because they all use the same renderer and JS engine. On the other hand, it's unlikely that people care about that, and the people that do are not likely to run iOS at all. Luckily, you don't have to use iOS. Or a phone. So some aspects of freedom are unchanged.


> users don't care. They don't know what a browser is, what a client is, or an URL or URI or address bar.

All true but irrelevant. The problem is that artificial lack of choice prevents competition and innovation. Microsoft from the 90s is here again. And users don’t care, again.

I do not count another UI as a new browser. It should be different under the hood.


> I do not count another UI as a new browser. It should be different under the hood.

Can you imagine any general user ever repeating that? I can't. As those users would say: nobody cares.

I personally would care, but I'm not the main marketed target for Apple. I also care more about getting stuff done than what engine my browser uses.


It doesn't matter if a general user would know about this stuff. The point is that they are being hurt by lack of competition and don't even know it. Even if they wanted to stick with Safari on iOS, that version of Safari would likely be better and have more features if it were forced to experience some competition.

This is why healthy competition and anti-trust actually matters. It's not because your average user cares about the details, it's because they're being impacted even when they don't know it.

This subthread has a fairly trivial example of this, but when you get into right-to-repair it becomes even more important. Your average consumer could be spending a lot less money to have a much better experience. The fact that they don't know about that is part of the problem; lack of knowledge doesn't make the issue irrelevant.

And we don't even need to get into the environmental arguments around reducing waste to consider this aspect of the issue.


> Not sure what that has to do with hardware but I'll bite: users don't care

Sure, if you keep a population ignorant, they won't care about the pettiness of their lives, as they don't know about freedom.


Those "happy" users are just ignorant users who don't know what to do with their money. Their happiness wouldn't be any lesser if Apple would provide ways to fix their laptops.

You should watch Louis Rossmann, who does repairs, to see how purposefully Apple makes any repair hard.

[1] https://www.youtube.com/watch?v=-uYUB8DZH2M&t=2649s


What you are saying is the same thing. “Unhappy users” are unhappy because of lack of the freedoms I mentioned. One has to fight for them.

Considering the environmental damages that this causes it should indeed be a right.

> I literally can think of no reason I personally would want to do this

If you personally care even a little bit about the environment, there is a pretty evident reason why being able to swap a single piece of hardware is better than replacing the whole unit because everything is soldered together.

Right to repair would also mean that once your device goes out of warranty, you would have more options to get it repaired for cheaper.


Not only that, but also:

* Apple charges +$800 for 64GB of RAM, which on Amazon would cost you around $300-350.

* Apple charges $1600 for 4TB of SSD, which on Amazon would cost you around $450-600.

* Soldering the SSD to the motherboard is an enormously bad engineering decision for servicing, backups, and data recovery alike.


Exactly. I feel more of us should speak up like this and not let ignorant users or shills try to sell this message that soldering down components and making it extremely difficult to replace parts like a battery is a "great thing" being done for the benefit of the consumer.

Reliability is far more important than repairability. I don't need to worry about replacing components if they don't go wrong.

> Reliability is far more important than repairability.

And why do you think they are mutually exclusive? I have a, I kid you not, 15+ year old washing machine that I am still using. I had to replace 2 parts, due to wear and tear, in the last 5 years, and it still runs as well as ever. I am glad that I could repair it for a fraction of the cost rather than dump it and buy a new one.


Thinner devices are great for the consumer.

> Soldering the SSD to the motherboard is an enormously horrible engineering decision

And a good sell-more-laptops business decision?


You assume that people have to throw away millions of laptops, but his comment (and mine) refer to the fact that a recent laptop works fine for 6+ years as-is. The lifecycle doesn't shorten but the need to toy with the innards has gone away for a lot of people.

> You assume that people have to throw away millions of laptops

Hmm yes I do: https://twitter.com/RDKLInc/status/1275100376350384132

Have you also never had anyone you know spill liquid on their keyboard after the warranty expired? Or drop it?

So in the end yes, I am pretty sure that there are literal millions of devices being thrown away because repairing them is too expensive.

Edit: I'll also add that in 6 years using a Mac, Apple replaced my MacBook entirely twice for issues that you would hope could be fixed without throwing everything away. So, sure, the customer is happy because they got a brand new device. But personally I would rather stick with my 1 year old device and reduce the environmental impact.


Those are not laptops, those are iPads, which weren't modular to begin with. Also, not broken, but locked, because the previous owner didn't unlock them before discarding them. Are there millions of devices that get thrown away in general? Sure. But I doubt it's only devices with RAM that was soldered down and branded Apple.

Regarding spilling on keyboards: I don't know anyone personally who does that, in my local area people don't eat or drink next to their devices. I do have plenty of people I don't personally know that did ruin their computers that way and I have replaced plenty of Apple keyboards and entire topcases to solve it. Works fine as a side-business.

Repairing is not the same as upgrading, and recycling is not the same as throwing away, and neither is reusing. The difference is important, as the solution changes with the case.

Overall, this would be more fitting in a general discussion on environmental impact and not as much one about software, the hardware it runs on, and human behaviour. If you want to solve a big problem you need to generify instead of brand it as far as I know.

To dive in a little deeper: to 'fix' this, people need to spend less time on how beautiful their stuff is, or how cool they look, and go a more utilitarian route. But since the American economy is built around the opposite, and the western European one has a tendency to follow it in some respects, that's unlikely to happen any time soon. Changing people is hard, but it's also the best way to solve or fix or change anything.


> Those are not laptops, those are iPads, which weren't modular to begin with.

Sorry, I thought this was illustrative enough to make my point, but if we have to be pedantic, then here you go, the exact same thing but with MacBooks:

https://twitter.com/RDKLInc/status/1251533085734252549

> in my local area people don't eat or drink next to their devices

Interesting local area you have there; at my office and all the offices I've been in over the years, everyone has a cup of coffee right next to their laptop.


> https://twitter.com/RDKLInc/status/1251533085734252549

Again, two MacBooks and not millions, as explained in the twitter thread. Again, not defective but locked by the previous owner. Not even remotely related to 'right to repair'.

As an owner, I want to be sure that when I lock a device, it's actually unusable to anyone else. So this is good (for me).


> As an owner, I want to be sure that when I lock a device, it's actually unusable to anyone else. So this is good (for me).

You want your data to be inaccessible. Not providing a way to factory-reset a perfectly functioning device (say, by replacing a modular storage) is just irresponsible.


No, you want it to be unusable, because this greatly reduces the incentive for theft.

Well, I want my device to only be usable to me unless I say otherwise. When I sell it I'll sell it unlocked.

> I am incredibly happy with every update from Apple.

I am incredibly disappointed with every update from Apple.

I bought my MacBook Pro in 2011. The most basic 13.3" machine that was available for sale. 320GB 5400RPM HDD, i5, 4GB RAM, Intel graphics.

I like backups. The optical drive had to go; I put it in an external USB enclosure. Its bay was taken over by the HDD, and in the HDD's original place I put a Samsung SSD. The HDD contains my data archives and a macOS installation image.

Whenever macOS would become cluttered, I would wipe the SSD and enjoy a fresh macOS installation. Whenever a new macOS version would be released, I would update the installation image.

This enabled me to work on the go with no fear of data loss and no network dependencies.

SSDs are basically long-lasting consumables. I painlessly replaced the drive twice; each time I got a faster and more robust unit with more capacity. Sure, I could carry an external drive with me. I have no desire at all to do that.

I like the mini jack, so that I can plug my favorite headphones. The audio-out port also doubles as an optical-out. Sure I can carry a couple of dongles with me. I have no desire at all to do that.

Sure I could have bought the top of the line 8GB RAM model back in 2011. I had no desire at all to do that. I could just buy 16GB of RAM and perform an upgrade Apple claims is not supported.

The battery lasts long enough. While coding, I get drained before the battery does. I can let the battery age with no fear of it bulging because it's contained in a shell. Replacement is easy.

I totally get where you are coming from, and agree with you that "a tiny fraction of users" want access to the hardware. However 100% would want access to the hardware when their 16" MacBook Pro SSD goes.

Don't professionals require machines that cater for their needs? If the current Apple offered choices like the ones I mentioned above, would you not pay a premium to have them?

I certainly would. And for that reason I still rely on the same 2011 MacBook Pro and not the latest MacBook GoodEnough.


I've upgraded my current laptop from 8 to 16 to 32 GB of RAM since 2014. I replaced the 750 GB HDD with a 1 TB SSD. Then I replaced the DVD drive with another 1 TB SSD. It was as easy as sliding the bottom out and loosening a couple of screws, and that's it. OK, I concede that replacing a worn-out keyboard was not as easy as on YouTube, but I like serviceable hardware.

For me the problem can be summed up as: "Every new MacOS release has less Unix and poorer (for a power user) design".

Apple has been slowly whittling away at the MacOS value proposition for developers.


> Apple has been slowly whittling away at the MacOS value proposition for developers.

Yes, I think the success of the iDevices have made them change focus. They are no longer interested in making devices for the power user. They now make more money selling iDevices, especially recurrent revenue. And so the ignorant consumers are now their target market.


This makes no sense! Storage and RAM requirements have steadily increased, and costs have gone down dramatically. On the other hand, keyboard technology has not improved massively, and CPU speeds aren't significantly different.

I'm not bottlenecked on those components.

I think on my 2016 MacBook at home, the only time I get bottlenecked on performance is CPU while playing EU4. Oddly, all the powerusery stuff I do could probably be done equally well on a 10 year old device if it wasn't for USB-C.


> I am incredibly happy with every update from Apple.

You are happy with the sluggishness in Catalina from security checks over the network before running applications?

> Sure I can't disassemble my laptop and change the hardware. I have no desire at all to do that. I literally can think of no reason I personally would want to do this

The cost of repair is higher because independent shops cannot repair the boards anymore or recover data when your storage, memory, and everything else is soldered directly to the board.

I don't doubt that you enjoy the platform, but I would be hard pressed to say that I would be happy with everything from any tech company.


> I am a developer and power user ... Sure I can't disassemble my laptop and change the hardware. I have no desire at all to do that.

That's a lie. Every power user tries to get the most out of their machine, and so desires an upgradeable system. No power user likes to throw away a system because s/he can't replace its battery or upgrade its ram or hdd / ssd. At some point, everyone likes to upgrade their existing system without spending a gazillion bucks for it.

Yes, it is true that with phones and tablets, corporates like Apple and Samsung have managed to convince the general mass that not being able to upgrade your phone or tablet is a NORMAL thing, and soldered RAM and soldered SSD are the only option, but we power users know that's a lie that Apple and Samsung (and others) would like us to believe.

And that is why their marketing is heavily based to push the exact message you are parroting:

> I imagine there are a tiny fraction of users who would and you may be among them.

I can bet that even if someone is not a power user, just an ordinary consumer / user, they too would appreciate the option to upgrade their phones / systems and reuse them. That is why we have efforts like the PuzzlePhone in Europe.


Wow. He doesn't want to disassemble his laptop, so he's lying about being a power user?

Have you ever heard of the No True Scotsman fallacy?


You don't need to disassemble a modern laptop or a PC to upgrade RAM, battery or HDD. Only most Apple systems need such crude and difficult disassembling to do such simple tasks of replacing these parts because they are deliberately designed like that.

Hell, even an average user would likely appreciate being able to bring their laptop to the local repair shop, give them $200, and walk out with the same laptop, but with more RAM, storage, and a new battery.

The only current option is to drop $1500 on the new model. It's not only more expensive, but wasteful.

I guess there are a lot of younger folks around here nowadays, but it was perfectly normal in the 90s and early 00s for regular non-technical home users to get out a screwdriver, open up their desktop PC, and replace the RAM, add a new/larger HDD, add a CD/DVD drive, even swap out or add a graphics or sound card. And there was a perfectly functioning computer shop industry that would do that for you on the cheap if you weren't comfortable.

In the past 15-20 years manufacturers have gotten us into the cycle of buy->discard->replace to the point that people don't even know it was ever different.


Every time there’s a new Apple announcement, I’m happier and happier that I switched to Linux a couple of years ago.

Power users want freedom and control of their own machines and hardware. The moment that Apple started soldering parts into place, I started looking elsewhere because it was a signal of the thought process.


> The moment that Apple started soldering parts into place, I started looking elsewhere because it was a signal of the thought process.

Very true! The first thing I did after buying a Mac Mini was to upgrade the RAM and change the HDD to an SSD, and I even added an extra HDD. The Apple options for the same were nearly triple the cost. Only an idiot would believe that having soldered CPUs, soldered RAM and now soldered SSDs is a good thing. It is a good thing only for the bottom lines of Apple, Samsung, et al.

And with ARM processors, you can be sure you'll get a PC with locked bootloaders that won't allow you to even install other operating systems. And the offered OS will also be a dumbed down system like ios that will only allow you install from a locked app store. All the while leeching off your personal data to the cloud.

I too have started exploring other alternatives.


On the flip side: does anyone still buy a low-specced machine and upgrade it mid-lifecycle? I've seen it less and less over the last 6 years and the last 2 years it hasn't happened in any work setting.

I know people like to have the feeling of control (do you really control your laptop if you can't replace your EC, CSME, AGESA, SSD FW, NIC FW, VBIOS) and the concept of shuffling parts around, but the need for that (at least in my circle) has pretty much faded.


Why wait for mid-lifecycle? Buy a low-specced machine and then drop in larger storage and memory rather than paying a huge premium to the vendor. It's not about control as much as just saving money.

Mid-lifecycle because saving 10% on an expensive device isn't what most people are interested in when they are buying something to work on.

Yes, people exist that do that or want that or really need to save every cent possible, but to many others that is not even interesting as a thought exercise.

So I referred to mid-lifecycle because those people that initially don't try to go on the cheap usually only did any part swapping because the machine was good enough and swapping machines was too much of a hassle: upgrades made sense.


> On the flip side: does anyone still buy a low-specced machine and upgrade it mid-lifecycle?

I've upgraded hard drives and RAM on every single Mac I've ever owned, except the last Mac I purchased, a 2015 MBP, where it wasn't an option.

It's looking like this one will be literally the last Mac I purchase. Apple's practice of charging premium, bespoke prices for commodity hardware was insulting, but at least there was a reasonable workaround before they started the soldering crap. Soldering things down is just a brazen money grab.

Can't believe I'm looking at going back to Windows but here we are.


Same here. My 2012 MBP has been upgraded with more RAM and a SSD and the battery is replaceable (but it's lasted better than my work 2017 model and is healthier!) but the cost of the new devices is so astronomical it'll be the last one I buy. Since I will have to invest in a new ARM device just to develop for their new platform, it's goodbye Mac and back to Linux (already got Windows as a side Dev machine).

Apple is making it a pain to do even basic things like replace the battery, which is a consumable part.

I've personally upgraded the hard drive and replaced the battery twice on my old 2011 MacBook Pro.


I have done that on a 2009 as well, but that is a slow machine at this point. I don't think I've used it beyond 2014. I have a 2015 around that I still use and that works perfectly fine, but so does the 2018, which is lighter. Neither needs upgrades or changes to do the work it is doing.

I did upgrade someone's 2015 in 2018 with a bigger SSD because they needed 2TB and it wasn't available at that time (they were on 512) and they used it as a local buffer for footage before offloading. Because they only have USB3 or TB3 drives and no TB2 drives, the multiple transfers on that MBP are too time consuming to do in the field.


Why does it have to be low-spec? I bought my current laptop with 16GB of RAM in it (soldered to the mainboard, sadly) because that's the most the manufacturer offered at the time. If I had the option today, I'd open it up and swap out the modules for 32GB. (I think they now offer a 32GB model.)

Ditto for the storage; I have 512GB in there now (probably wasn't the max when I bought it, but seemed sufficient and was a good price trade off), but I wouldn't mind swapping it out for a 1TB part. Actually, this bit might be possible to do myself; I should look into it.


Apple usually has a large premium to upgrade the specs. If you can change your own memory or SSD you could save a considerable amount. But now they are soldering everything.

That is only something people take issue with because it used to be modular. You don't see people making the same argument with phones, printers, iPads, TVs, Cars, Smartwatches etc.

I'm not saying soldering things down is ideal, but it's only a problem because people perceive an option that was there but now isn't as 'bad faith' or 'taking away' something. Imagine the riots if we used to be able to change the CPU in a laptop. (in theory we 'used' to be able to but in practise that never worked well because the microcode didn't fit in the firmware and the cooling solution only takes a small amount of TDP variance - popping in a faster CPU meant it overheated so quickly it was useless)

Would it be cool if all devices were composable? Sure. Is it something users care about? Not really.

I'd like it if Apple found a way to allow swapping of components within the design envelope constraints they created for themselves, but it also won't have an impact on what I'd buy.


> does anyone still buy a low-specced machine and upgrade it mid-lifecycle?

Yes, they do. Sometimes a technology like SSD itself gives a great boost in performance equivalent to a new costlier system.


I’ve certainly bumped up the RAM in the past.

No, this is not what I want, and I'm a power user. I want my machine to solve my problems and not get in my way when I am creating solutions. I do not want to be messing around with the internals; I have better things to do. I pay for it to just work, in an environment I hope is vaguely private.

> I want my machine to solve my problems and not get in my way when I am creating solutions.

That's literally why I switched to Linux in 2018, after a decade on OS X.


Linux does not solve my problems (as a power user with Linux). It does not work with the proprietary tools I need to do my job and I’ve broken apt way too many times. I’m not remotely interested in debugging this at this stage of my life.

How are you managing to break apt not once, but "way too many times?"

I haven't used Linux as a main driver in a decade but I do use it very frequently and I am intrigued by all the many ways people online seem to be able to break it. I have never once run into a problem with apt.

It really makes me wonder how much time people who complain about Linux have really put into learning it. I'm not thinking you need to read a book on it, but after a few months of just using it I would think one would be reasonably proficient?

I understand the claims of needing proprietary software. I, too, wish that the Office suite were a first-class citizen. That said, it absolutely baffles me the number of weird ways people have issues.


> How are you managing to break apt not once, but "way too many times?"

Raises hand my home server (running Debian) is currently in a weird state where a whole bunch of packages won't install or update because some library's stuck on a version they don't like. The version they want isn't available. I'm pretty sure I haven't even added any nonstandard repos to it (all it really runs is docker, so anything odd comes in as a docker container rather than a system package—this in part because I don't trust Linux package managers not to break my system when installing non-critical software due to long experience of same happening over and over, go figure). I'm not sure what I did to cause it and it's still running Docker fine so my incentive to spend an hour tracking down and fixing whatever-it-is has been zero—I'm just glad it's not a system I depend on for anything serious or need to really do anything with.
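
(For what it's worth, if I ever do sit down to untangle it, the usual first moves would be something like the sketch below; the library name is just a stand-in for whatever is actually stuck:)

    sudo dpkg --configure -a          # finish any half-configured packages
    sudo apt --fix-broken install     # let apt try to resolve the dependency knot
    apt-cache policy libfoo1          # see which versions are actually available (libfoo1 is a stand-in)
    sudo apt full-upgrade             # sometimes the stuck library just needs the rest of the system to move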

I spent a few years running Gentoo as my main machine in the 00s. On a laptop. Got suspend-to-disk working, even. I had to manually install Grub on this same Debian home server when I built it last year, in a chroot to my newly-installed system, because the installer kept failing to do it (my process to fix it was very ordinary and encountered no problems, AFAIK, so I still don't know why the Debian installer couldn't do it, I've never seen it fail quite that way before, but it did so repeatedly so I had to give up and open up the engine, as it were). So I'm not entirely clueless.

Nonetheless I don't really feel confident using a Linux desktop I can't snapshot for emergency rollbacks or rebuild in a few minutes from a script, because damned if weird problems don't crop up when you upgrade. Or don't upgrade. Or reboot, forgetting you'd installed (through the blessed package manager!) a new kernel and now it can't read your encrypted root anymore so is unbootable and now you get to spend some time figuring that out. And so on.

I feel no such anxiety on macOS. Not that it never breaks, but it's rare enough I don't worry about it. FWIW I'm back on desktop Linux now due to Macs' insane prices, but I suspect I made a mistake and the time I've already lost would have made spending an extra $500 on a significantly worse-spec'd machine well worth it. It's a little here, a little there, but it adds up.


Fwiw, I actually manage my linux laptop with Ansible.

I can't tell you about all of the times, but I will tell you about one time it broke and why. I did a clean Debian install. Then I added the Spotify repository. Then I apt-get installed Spotify. The script hung and then apt got into this weird loop. I had to add exit 0 to a script to fix it. This isn't what I want to be doing. I have over 10 years' experience with Linux. I've even done Linux From Scratch. I learned it already; this is not the problem. The problem is that Linux currently doesn't have the stability that satisfies my needs and life goals.
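
(For anyone who hits the same thing: the "add exit 0" trick means neutering the package's maintainer script under /var/lib/dpkg/info/ and then letting dpkg finish. Roughly like this; the exact package name is my guess, so adjust as needed:)

    sudoedit /var/lib/dpkg/info/spotify-client.postinst   # add 'exit 0' near the top of the hung script
    sudo dpkg --configure -a                               # then let dpkg finish configuring packages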

Proprietary tools, I get. But "breaking" linux is something I don't buy.

The latest T series ThinkPad has soldered RAM. Granted, it still has one non-soldered slot, but still. It feels like the world is closing in on those of us who value power and control.

As do most other high-end laptops...

There’s a lot more options for high end laptops for Linux.

Apart from the Dell XPS 13, what other options are comparable to the MacBook Pro?

I used to use macOS because it was a Unix that looked good and just worked, but both observations have become less and less true with time.

In that same time, Linux really caught up and diverged such that it offers features in its kernel, like cgroups and kernel namespaces, that macOS doesn't match, leaving me to feel as if the old FreeBSD roots of macOS are less of a perk and more of a kludge in 2020.
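
A concrete example of the gap: on Linux, dropping a shell into its own PID and mount namespaces is a one-liner with util-linux's unshare, something macOS simply has no equivalent for:

    # start a shell in fresh PID and mount namespaces, with its own /proc
    sudo unshare --fork --pid --mount-proc bash
    # inside, `ps aux` only sees processes from the new namespace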

Today, I run Linux on my MacBook because it is a modern Unix clone that looks good and works well with all of my development tools. Native Docker is a treat.


What flavor of Unix, if I may ask? Do you have driver issues? TIA! Cheers.

I'm using Kubuntu, and I haven't had any driver issues yet, but I'm not doing anything crazy with hardware or using proprietary drivers. It's been a pretty pleasant experience.

I've always used Apple products because they just work. UNIX compatibility has been great; I've run Docker, run virtual machines, and given presentations, and I love the hardware. Most of the restrictions are tradeoffs for security, and the real problem is that I have tried having workflows on other systems and tried to switch off of it (I had Linux on my MacBook Pro 2017). Below are my most frustrating situations:

1. My MacBook Pro 2017 had some weird problems due to CPU voltage. Somehow they misplaced the laptop, but they eventually did a full top case replacement. It was weird when someone came out of the store in handcuffs while I was at the Apple Store. With that laptop I wasn't disappointed by the keyboard or the Touch Bar and never experienced reliability issues. I sold it out of frustration, but I shouldn't have.

2. My AirPods have been replaced twice due to buzzing. This could have been me not cleaning them; I still experience buzzing sometimes, but cleaning them seemed to work. I got both earbuds replaced twice.

3. iCloud got messed up on my MacBook 2016 when I installed a beta version of Catalina. That was my fault. I had to go through support, and they never got back to me about an engineer looking at my iCloud.

4. My MacBook Pro Retina (around 2012) had a zebra pattern on the monitor. I took it to the Apple Store twice and they just gave me a new laptop.

Even with all of these issues over the years, Windows is much more of a pain to use. On my gaming computer, Wi-Fi can still take minutes to start. I had to upgrade to Wi-Fi 6 with an external adapter, and that may have fixed it.

My current laptop (MacBook Pro 16) has good thermal management now. Plugging into a monitor does push the thermals to 60 °C, and sometimes I turn off Turbo Boost. Other than that, having a laptop with 64 GB of RAM, a 5500M graphics card, and a 2.3 GHz 8-core Intel Core i9 is pretty nice. Somehow I even managed to run a few fine-tuning iterations of GPT-2 on the CPU, though that was by mistake while I was cleaning data.

I like the Touch Bar because it sometimes lets me avoid switching to the trackpad, which helps with my RSI. I also have an external trackball mouse.

Wi-Fi and Bluetooth work with no issues. The system is really responsive, even compared to my Windows machine. The screens have always been amazing and color-accurate. Emacs runs great too. Terminal.app keeps getting better. The UI looks like it's going to be cleaned up in macOS 11. Having ARM is going to be very exciting, and Metal is even more exciting and might be the only thing that starts to compete with NVIDIA.


Interesting you mention Emacs. The Mac port of Emacs runs noticeably slower than on Linux, depending on the operation (Magit especially is very slow). This is due to the much slower fork call on OS X, I believe. I found Emacs so painfully slow on a MBP 16" that I gave up on OS X completely. It might just be that you're used to it and it's good enough. For me, OS X feels like a laggy mess.

I have experienced Linux being more responsive than macOS, both on my gaming / deep learning computer and when I loaded it on my MacBook Pro 2017. The problem was that the rest of the OS didn't work or was a nightmare. On a MacBook Pro, Linux would never suspend correctly. On my gaming / deep learning computer, Ubuntu keeps updating the kernel and breaking my setup, to the point that I gave up on installing Linux multiple times. And by the time the Dell Developer Editions started being competitive, I would be SSHing into AWS or doing something completely different.

My workloads are getting to the point where a Colab Pro notebook sounds like a good idea, or setting up my gaming / deep learning computer so I can RDP into it from Windows 10. By now I would probably even want to install Windows Server 2019, since the NVIDIA drivers would support the card and I wouldn't have to manually kill background services that shave 30 FPS off my video games, making them unplayable on a Titan RTX.


Docker on macOS has been pretty bad for me, too...

This somewhat echoes some of Steve Jobs's own complaints about the Apple he was kicked out of: https://youtu.be/Gk-9Fd2mEnI?t=2078

Also note that at the time he was building NeXT, a Unix company.


Thanks for linking the video. Interesting to listen to Jobs -- it seems to me, too, that he argues against what Apple is doing nowadays.

I actually looked 10 minutes ago at how much a 32 GB Mac costs nowadays -- it's about 2x more than a 32 GB PC. That's less than some years ago, when a Mac was more like 3x more expensive, if I remember correctly. Looks like the trend Jobs mentions in the video, about lower and lower ASP. (ASP = average selling price, right?)


I second this. My 2015 MBPs are the best machines ever. The only things that could make them better are upgradeable RAM (16 GB can be tight with VMs) and an NVIDIA GPU for faster DL prototyping offline. Apple Unix has a long history (A/UX), but Mac OS X was definitely the best OS ever. As Apple moves away from this Unix power user's paradise, MS seems to be moving in, with tighter and tighter integration, now offering easy GPU passthrough to the Linux subsystem... Open-sourcing the rest of 10.14 would be a really awesome move by Apple (one can only dream).

I came to the Mac because copy/paste into a Windows CMD terminal was abysmal in 2010. I didn't know about PowerShell at the time, so I don't know whether that would have made life nicer.

The other reason I switched was that all of my developer friends knew nothing about Windows, and getting stuck on a maybe-Windows problem meant I had to fend for myself when I needed help the most.


I have a 13" MacBook Pro. It really is the pinnacle. I don't even know which OS X it is, but it just feels a lot more 'serious'.

I'm not a fan of the new OS X. I guess mobile is more prevalent, and maybe Apple is hoping the iPhone and iPad can be gateway drugs into MacBooks / desktops by being more of the same.


Maybe someone more knowledgeable about Apple history can correct me, but isn't MacBooks being attractive to developers due to the Unix layer a happy accident? Apple never planned for that, and I don't think they care about that demographic any more than before. I've used Macs for nearly 30 years and I find it useful that I can now also develop on them, but I can totally envision switching for work, just like in the pre-Intel era. Apple has always been about end users; don't fight it.

It all went downhill once a computer became a smartphone / tablet and the masses accepted that not being able to upgrade them is normal. Now we have Mac Minis with soldered RAM, soldered CPUs, even soldered SSDs, and batteries that cannot be removed. The huge amount of waste this generates is so upsetting.

I wish more efforts like the PuzzlePhone in Europe would bear some fruit.


> And now that balance has tipped away from the power-user, the developers, to the end-user

What do you mean by that? (Personally, I can't think of anything I can't do now that I could do five or ten years ago on my Mac. Yet a lot of things are better.)

