EDIT: I suppose most users in the Apple ecosystem aren't on MacBooks, but iPhones.
I've spent zero days, zero hours, zero minutes, and zero seconds fiddling with setting anything up in macOS over the last 15 years.
I manage a bunch of Linux systems, from laptops to desktops to servers, and over the last 15 years I've probably spent months of my life setting up stuff that should have just worked.
I use a Mac at home and as my dev machine, because I charge quite a bit of money to do that kind of setup work on Linux, and I don't want to give it away for free in my spare time.
> iTunes, Mail, etc, none of the software shipping with macOS is actually any good.
I use many Microsoft Office apps and Adobe suite apps (mainly Photoshop, Lightroom and Illustrator), and all of them just work.
VNC clients, VPN clients, Samba, they all just work. External devices, no issues. Do I want to stream my display over WiFi to my television, or my audio to the stereo? Just one click away. Do I want to restore the backup of one directory from 2 years ago? It's three clicks away.
On Linux, half the steps to each of these things are filled with hours and hours of research in the Arch wiki. Do you want to stream your monitor to your TV? Start by setting up a media server, and then 200 steps. Do you want to restore a partial backup of some folder on a Btrfs drive? A dozen steps that I cannot remember. Like, no way. I don't want to remember how to do these things, and I don't want to have to look it up. If nobody is paying me to do these things, it should be completely obvious how to do them and it should just work.
But hey, do convince your employers to use Linux everywhere, it's great for the business.
In many ways *nix is still stuck in 1983 because it's a craftsman's platform, which is why there are 10,000 distros, oops 10,001, no 10,002, oh 10,003, ugh! Linux is great on servers, because all servers require craftsmen for their care and feeding, and Linux is so much better than Windows in that regard.
I gave up my Linux machines because I'm no longer interested in being my own sys admin. Frankly I was done being my own sys admin back in the 90's but I felt the need to keep my propeller spinning, and stay true to my geek roots.
In 2008 I bought an 8-core Mac Pro, and that machine is still kicking: my son uses it for machine learning as part of his graduate program, and it's just as fast now as when I first bought it. It was left behind with the last OS upgrade, but it will work just fine for another 4 or 5 years on the old OS.
In 2009 I still had a Linux laptop, but it was painful back then. With Snow Leopard it became easier to create and maintain a Hackintosh than it was to manage a Linux laptop, and I was off Linux for good.
For a hobbyist (craftsman) Linux is great; for a server that needs a craftsman, Linux is great; for a consumer, buy a Mac.
My reason for giving up was reaching the conclusion that Linux distributions would never replicate the expectations of an Amiga/BeOS like experience for those that care about multimedia.
The Linux experience depends on the distribution, software and hardware you choose. You need to choose carefully.
So here's the interesting part: can you recommend a distro that is universally the right choice for dev work?
If yes, please tell - I’d love to know!
If the answer is "well, it depends a lot on your needs", that's kinda the point of macOS, right? The fact that making the right distro choice requires serious thought is already problematic (hypothetically).
I have to say that I 100% do not believe this statement.
You mean you've never used any wireless connections to your machine?
You've never had to set up backups or change audio settings?
You've never had to deal with setting up one thing in 15 years?
Since when did OSX gain the ability to read minds?
You seem to confuse using the machine with setting up.
Connecting a wireless connection (picking the SSID, entering the password, etc) is not "setting up". It's merely regularly using the computer.
The setting up we'd rather avoid, and that the parent talks about, would be tinkering to get your wifi chip working with your OS -- which you get the "pleasure" of doing on Linux...
I was glad to return to Windows when the Linux-specific issue was solved.
When it hangs, it finally exits with an error message saying it couldn't mount /boot/efi. Once I log in I can see that /boot/efi is mounted fine.
A bunch of my applications hang all the time for reasons I couldn't figure out. Turns out they were trying to communicate with my Gnome keychain over D-Bus, but they were in a separate D-Bus session so they just sent their D-Bus message and then waited for however long until they gave up. No idea at what point this started happening or why, but I never did figure out how to fix it (other than disabling those apps from being able to store passwords securely).
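The failure mode described above (a message sent to a bus that nobody will ever answer) can be mimicked with a plain socket; this is a minimal sketch, not the real D-Bus protocol. The listener below accepts the connection but never replies, so the client blocks until its timeout, just like an app pointed at a stale D-Bus session address:

```python
import socket

# Stand-in for a D-Bus daemon in another session: it accepts
# connections (via the listen backlog) but never answers.
server = socket.socket()
server.bind(("127.0.0.1", 0))  # ephemeral port, purely illustrative
server.listen(1)

client = socket.create_connection(server.getsockname(), timeout=0.5)
client.sendall(b"RequestSecret")  # the message goes out fine...
try:
    client.recv(1024)             # ...but the reply never arrives
    timed_out = False
except socket.timeout:
    timed_out = True

print(timed_out)  # the app just sits there until it gives up
```

Apps that query the keychain synchronously hang for exactly this kind of reason: the send succeeds, so nothing errors out; only the wait fails.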
If I used the Intel drivers for my Intel GPU I got extreme stuttering and tearing, which only went away when I uninstalled the Intel drivers entirely so that it would use the "correct" drivers.
Then there's snaps. I install stuff and sometimes it's a package, and sometimes it's a snap. If it's a snap, everything turns into a giant mess. I had Firefox saving its downloads to Skype's workspace for some reason, multiple Firefox profiles on my system in various places, multiple versions of Firefox running when snapd downloaded a new version and symlinked it to be the default, two copies of Telegram for some reason and no obvious way to tell which was which, a snap and non-snap version of things because the Store installed the snap but another package depended on the apt package, and so on.
I installed my system on btrfs because "it's like a decade old now, it should be fine right?" and within a month my filesystem had been completely corrupted and I had to reinstall entirely (though I managed to salvage most of my home directory at least).
This is all stuff I'm willing to tolerate to some extent, but I can't think of a single thing that Linux offers me that makes it worth all this hassle. At my last job, I happily used a MacBook for nine straight years, and I never lost an entire day of productivity just because suddenly my system won't boot, or dealt with graphical glitches in every app because the driver for my video card isn't the right driver for my video card, or lost data because one of the default options for installation filesystem is unstable and unreliable.
I've never had to do much of anything to "tweak" macOS unless I wanted to; on Ubuntu, I have to do it almost every day to get something working well (or at all).
So I'm gonna go ahead and say that yeah, macOS isn't even remotely dead, and it's certainly not going to lose to Linux any time soon.
> You mean you've never used any wireless connections to your machine?
Yes: I clicked on the network, typed the password, and it worked, always. If the machine has already seen the network, I don't even have to click, it just connects. If one of my machines (e.g. my iPhone) has seen the network, the network and its credentials are stored in iCloud, so my laptop and my tablet can just connect to it. I never had to google how this works, never had to install anything to make this work, never had to change any configuration file to make this work, etc. Same for VPNs, VNC, corporate networks, network drives, etc.
> You've never had to set up backups or change audio settings?
Yes, I clicked on enable backups, clicked on the box that enables daily backups, clicked on the drive where I wanted to store them, and it worked, always.
I can restore, browse, and partially restore backups with one click using Time Machine. I never had to google how this works. It always worked.
> change audio settings?
Yes, I clicked on the audio button, and volume worked, muting worked, all 4 sound outputs worked, always worked, etc.
On Linux, over the last 15 years, I have googled for days at least to do _each_ of these things for customers:
- can't connect to a particular WLAN, corporate WLAN, etc. Needed new drivers, a kernel upgrade, a change to some /etc config to enable something, etc. Same for network drives, cloud drives, SMB, ...
- sound, damn sound: sound cards not being detected (hell, on my Lenovo Yoga an update to Ubuntu 19.04 pushed a kernel patch that broke my sound card's driver: no sound card detected, still not fixed on Ubuntu 20.04 five months later; no sound in the times of corona and web meetings, great!), alsamixer, /etc files, PulseAudio, ... I have had to google a lot about each of these things, and fiddle a lot over the last 15 years, on dozens of machines, for dozens of users, for hours.
- backups, on Linux, have fun. I've set up backups for ext4, ZFS, and Btrfs. ALWAYS had to google for hours. Every time I had to restore a backup, I had to google for hours. Sometimes restoring did not work. Sometimes backups did not work. Encrypted backups do not work out of the box (Apple's full-disk encryption is enabled by default, and enabling it and having it work with everything is just one click; on Linux full-disk encryption is a mess).
So no, I never had to set up anything on macOS in the last 15 years. The things were in the obvious place, and I just used them, and they just worked. No need to even google how to do the thing, or what to install to get it done, or configure the install, or read bug reports to work around issues, nothing. They just damn worked, as they should.
It's 2020, and setting things up for companies using Linux is my day job. These people would all save a ton of money by just using Macs or Windows machines. The amount of time their employees waste on setting up and maintaining their machines instead of doing their actual work is astonishing.
At this point I'll probably switch to Linux as soon as my 2015 MacBook stops working, but I would've really liked to stay with Apple.
It always surprises me when people cite Homebrew as a plus rather than a minus in these discussions. Compared to the experience I have with Linux package managers, I've found Homebrew to be extraordinarily slow even at the best of times. A simple "brew update" will generally take at least 30 seconds; in that time on something like pacman, I'll usually not only have already finished fetching the list of updates but also actually performed the upgrades, and that's updating my entire system, not just a few userspace packages that I have installed. I can understand that some people prefer macOS and I don't think they're wrong for doing so, but it's hard for me to believe that someone has given Linux a fair shot when they cite package management as something that macOS does better.
MacPorts is well worth a spin if you're on a Mac and have trouble with Homebrew. I switched sometime last year and am very happy. I think I've actually been using it more because I know it's not going to be a pain every time.
The biggest argument at the beginning for Homebrew over MacPorts was binary distribution and now MacPorts has that and can still compile custom versions if necessary.
MacPorts is always the first thing I install on a new Mac and has been for over 15 years. The several times I've seriously tried Homebrew, I get frustrated (terrible jargon, missing packages, spewing files all over my system, lax security) and go back to MP.
I'm up for giving MacPorts a go. But is there a pain-free way to transition between Homebrew and MacPorts, without having to reinstall everything over again? I've installed a load of stuff with Homebrew over the years, and don't fancy having to pick through it all again to try and remember what I installed and why.
If you're using Pacman (and presumably Arch or some derivative), that might explain your better experience on Linux. You're on a rolling distro so you get up to date packages (and I believe AUR gets things even quicker).
Homebrew is much closer to that rolling always-up-to-date (within a day or two usually) experience than Apt or Yum on Debian/Ubuntu/Fedora (most people's linux experience), where you have a choice between packages that are 3-6 months out of date or hunting around on the web for a 3rd-party repository to add which contains more up to date packages.
Homebrew can be slow. But the fact that I can `brew install X` or `brew upgrade X` and get on with something else while it's doing its thing means it doesn't generally take up much of my time.
It still has a centralized list of the versions of the packages it supports (on github) though, right? I feel like it should be able to get this list with a single HTTP request to the Github API; I don't see why having a faster or slower schedule to update the packages should affect how long it takes to actually download the list of updated packages with `brew update`.
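The single-request idea suggested above could look something like this sketch. To be clear, this is not how brew actually works; the JSON here is a canned stand-in shaped roughly like GitHub's "list commits" API response, and the caching logic is an assumption for illustration:

```python
import json

# Canned response, shaped like GitHub's commits API for a package repo.
# A real client would fetch this with one HTTP GET.
response = json.loads('[{"sha": "abc123", "commit": {"message": "bump formulae"}}]')

cached_sha = "def456"           # the sha we stored after the last update
latest_sha = response[0]["sha"]

# Only download the full formula tree if the tip of the repo moved.
needs_update = latest_sha != cached_sha
print(needs_update)
```

The point is that checking "did anything change?" can be a single small request; the expensive sync only needs to happen when the answer is yes.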
This is subjective. I think the macOS UI looks and acts like a toy. My Linux machine has a Solarized theme (I can toggle light/dark) with minimal window borders (i3, polybar). I think it's much more tasteful than macOS. GNOME looks better too, and it's themeable, unlike macOS.
But this is all of course just my opinion.
Same for GNU/Linux if you choose the hardware for it (or buy preinstalled).
What I do agree with is that on day one support might not be as good for new types of hardware (especially for more niche hardware like a fingerprint reader for example). Even that has improved a lot over the years though.
Thinkpads usually work really well out of the box (even the nipple mouse works without a hitch).
I have plenty of horror stories about other computers though. Usually involving being deep in some page tree on a company's website that they don't seem to have actually expected anyone to go to, trying to figure out which specific vendor id maps to the actual hardware I have.
I've been maintaining a Fedora desktop installed 8 years ago without a snag, updating every year or so as time permits. Not a single problem.
... until Apple decides to break an API and not provide equivalent functionality, and then you end up with a perfectly functional soft-bricked paperweight. I can't use my external USB monitor because of this.
In 2016, after getting tired of various hardware bits never working just right on Linux, I gave up and tried someone else's hardware. It was amazing that everything just worked with Linux. I didn't have to fight with bleeding-edge, just-reverse-engineered drivers. I didn't have to wonder if the laptop was going to wake up again after being put to sleep. But I chose somewhat poorly: the hardware I chose was decent, but rough. The battery expanded to the point where it warped the case, which caused the keyboard illumination to intermittently not work, and eventually made the touchpad impossible to use.
So I gave up in 2018 and got a MacBook Pro again, though it was a secondhand 2016 model. Once I installed Linux, I found that the keyboard and touchpad didn't work, and I had to build an out-of-tree driver for it, ditto for the WiFi. After that I found that audio and suspend-to-RAM didn't work. It turns out I was "lucky" that I had an older model; the keyboard and mouse setup in the 2017 and 2018 models hadn't been reverse engineered yet and didn't work at all.
In 2019 I got fed up with having to use the USB-C headphone adapter from my Pixel in order to hear sound, and to remember to shut my laptop down completely if I was going to be away from AC power for more than a few hours. I again left the Apple hardware world. It's not perfect, but again I marvel at how everything just works (with Linux) without tweaking. I hear so many colleagues complain about various inexplicable broken things on macOS all the time, and I think I've gotten the better deal.
Is Linux as polished as macOS? Of course not. I agree with you that the graphics aren't as smooth, font rendering isn't as good (though this is debatable; personally I find Linux rendering to be more crisp and readable than macOS), HiDPI support varies depending on what toolkit any given app was written with. But it works well enough. I think one of the biggest problems that has kept (and will continue to keep) Linux from going mainstream is that even though for the most part it works just fine, when it does break, it breaks spectacularly to the point that you need specialized knowledge to fix it. Fortunately I have that specialized knowledge, so I'm fine, but your average home user does not.
I don't have to deal with Apple's keyboard design issues, random WiFi failures (yes, it is incredibly funny to me to find that Linux WiFi can work better than macOS), Apple's overzealous "system integrity protection", or Apple in general deciding they know better than I do as to how I want to use my computer. It makes me happy, and I feel productive.
It's a hostile platform; it's expected to be rough. I use a 14" Dell Latitude - no problems at all, beautiful machine. I'll probably stay on it, or try an Asus ZenBook.
I have a Dell XPS 13 now, and while I like it, it's just not as pretty as a MBP. I recognize that that's a highly subjective judgment, but that's a thing that matters to me. The XPS's build quality is good, but not amazing; for example, I've had it for less than a year, and one of the rubber strips on the bottom of the laptop has been peeling off for the last few months.
Overall it just makes me sad that "hostile platform" is even a thing. At this point I've decided that the alternatives are good enough for me, and the better hardware support tips the scales in favor of them.
2) macOS is not a cloud OS. It certainly can be a portal to the cloud quite easily, via terminal, iTerm2, ssh/mosh, VNC etc. But you know that already.
You can also run VMs, KVM, docker et al on it as well.
- HoudahSpot (search files / documents)
- DEVONthink (document management system)
- Ulysses app (a markdown writing app; I know there are many, on many platforms)
- iA Writer (another markdown writer)
- Alfred App (automating)
- Keyboard Maestro (automating)
- AppleScript (automating)
- Things3 (task management)
- PDF Expert (reading / annotating pdfs)
- aText (text macros)
- Timing App (auto time tracking)
and many more.
Many of those apps have their equivalents in the Linux / Windows world, but their Mac counterparts are imho much better.
Finally, it's the overall experience. Everything is streamlined, everything works across iDevices, everything (mostly) works. I can't say it in a few sentences. Sitting in front of this machine, coding, writing, listening to music, that's a joy. I didn't have this feeling with a Windows machine (I was a Windows and Ubuntu user until 2015).
The UX, UI and 'feel' is what makes people use one thing over another. Unless you can replicate that, or can do better, nobody's getting swayed either way (and that's fine, do what you want).
There is no objective 'my pc is better than your pc' because you're not the same person as the one you'd be attempting to compare against.
Better hardware and better drivers are useless arguments if someone wants to go to a shop, take a device home, plop it down and do some work. None of the betterness of the other stuff is going to undo the work, or undo the experience.
It's like telling someone their 24-pack of toilet paper is stupid because your shipping container full of toilet paper is cheaper and you have more than they have. (especially when all they wanted to do was wipe their butt)
Say you have two options, same price, different vendor. And the main difference (for the sake of the argument) is that one comes with a mobile i5 and the other one with a mobile i7. Technically the price-to-performance ratio is better for the i7 model. But the driver for me to buy any device at all is whether the device can do the task I need it to do. If the i5 does the task the same way the i7 does the task, what is the extra value that the i7 supposedly brings?
Often people will make a 'what if' argument here, because "what if you need to do something different" sounds like a nice argument to get more than what you need. In reality, I don't often see anyone changing their tasks mid-lifecycle and then having a sudden requirements change. It hasn't happened for me at all.
The same parallel can be made with the cargo capacity of a vehicle: technically every Lamborghini ever is a stupid choice because you can't move your furniture with it. Or every Fiat Panda is stupid because you can't drive at 200 km/h. Make it more absurd: every boat is stupid because it doesn't fly like a rocket.
This is probably my biggest complaint with macOS.
I'm still shocked by the inability to disable inertia scrolling on a usb mouse, as well as acceleration.
These things were removed on purpose.
If you use a MacBook with USB devices, there are all sorts of oddities. It's annoying. I can use it, it's just not for a power user.
For me, power user means "configure it", shape it to maximize your workflow speed.
macOS is just not that. With Linux there is a lot more flexibility.
I don't care too much about eye candy, I want my UI to be incredibly fast. macOS has those damn animations when switching spaces, which are annoying to say the least (I switch spaces very often).
For all the negative replies I wonder how many people have actually used Keychain. It's very good. I tried 1Password, Google's, and Firefox's, but none integrates as well among all my devices, with zero effort in the OS, as Keychain. Anyone leaving negative comments about Keychain, at least point out that you've given it the ol' try in the Apple ecosystem.
Or we have different assumptions, and you have devices outside the Apple ecosystem or don't want to be tied to it, in which case the benefits I see in Keychain don't apply to you.
Also, yesterday's WWDC showed some cool new features around bad passwords (but I think other password managers already do this).
(Not affiliated with it, just a pretty happy (free) user!)
Does it actually work the same as Mobile Safari's interface on iOS? Automatically presenting options both for creating and autofilling passwords on webpages? (If it can do that in iOS Firefox, I might consider it...I've been using desktop Firefox for a while because of some bizarre idiosyncratic issues I have with Safari on my laptop, that I can't reproduce anywhere else, but are extremely irritating where they do happen...)
A big benefit of switching to Bitwarden is that you can also get it up and running on non-Apple machines. Kicking myself for not doing it earlier.
Just be aware that bulk exporting your passwords from iCloud Keychain is non-trivial.
If you create a Netflix account on your desktop in Chrome, it can generate a password for you, or auto-grab whatever password you typed in, and will autofill that password when you open the Netflix app on your phone. (Based on a built-in association between "netflix.com" and the Netflix android app.)
Super handy feature when you get a new mobile device and need to get logged in to all your apps for the first time.
Yeah, keep telling yourself that. They are a part of PRISM.
Internal quality control for macOS and built-in Apple apps has fallen off a cliff in recent times. Catalina has been disastrous in that regard. Having used Macs since OS X Leopard, I'm not jumping to Big Sur. I switched from Windows because it was a buggy, hot mess. I'm not hanging around and risking my work as macOS quality control keeps falling. Apple's developers have lost my trust.
As long as your standard is low, sure. Scanning with Image Capture results in needlessly large files, and it has turned out to be quite a piece of crap. I moved my scanning pipeline back to Linux, and now I've got text recognition for free!
AFAIK, Windows and Linux still don't have such a tool that can back up/restore without pain.
Less cohesion, competing GUI libs/configuration systems/etc, still not 100% GPU support, the Wayland situation, almost non-existent multimedia (video, audio) options (not to mention support for peripherals like audio interfaces), and being at the arbitrary mercy (worse than Apple's) of your favourite desktop environment's programmers (e.g. abrupt transitions and bizarro decisions in GNOME and KDE major versions).
I'm a graphics engineer so I'm not sure I understand this argument. When I talk about weaker and worse-performing hardware options and drivers on Mac, I'm referring _mainly_ to GPUs.
> Non-existent multimedia (video, audio) options
This... is simply not true (?)
Also, regarding GNOME/KDE, I dislike both and don't use either. Something like i3 with zero-to-no configuration is already pretty great out of the box.
And when I talk about weaker support for GPUs on Linux, I mean the Wayland / compositor situation, getting 3D to work without planning what specific model of laptop to buy to use with what specific distro, and so on...
>This... is simply not true (?)
This is simply true. It can't run Cubase, Logic (ok, that's expected), Pro Tools, Studio One, Reason, FL Studio, Maschine, and tons of other things besides. Your best bet is some niche players like Bitwig and Tracktion, or crossing your fingers with Wine... And let's not get started on drivers for professional audio interfaces and peripherals...
Same for NLEs...
> (seriously, safari is the new internet explorer).
I'm not sure what this is supposed to mean. Safari is holding back the web because despite its monopoly on users it doesn't support new technologies? That's not true at all. Especially because it's got such a small market share, it should be up to them to deliver what they feel makes the most sense, and let the market decide the rest (though sadly the market seems to keep using Chrome, which is terrible).
Honest question though: as a Safari user almost exclusively for the past 15 years, what am I missing out on that I could have experienced if Safari had supported WebGL 2? I don't know that I've encountered any sites or use cases where Safari was holding me back in that regard, but I'm curious to know what was out there that I couldn't see before.
> iTunes, Mail, etc, none of the software shipping with macOS is actually any good.
It's better than the open source stuff on Linux. While it has gotten much better over the years, it's still not good enough for normal people like my parents or my sisters who are not techies. That's just reality for now at least. It's pointless as to whether or not you agree with it.
The other benefits of Mac hardware (MagSafe for example) have all been removed leaving not-a-lot.
Our team, without any policy dictating daily development habits, has coalesced around Linux/Android development with iOS testing before release.
If it is a good react native app you won't know it is RN.
If you notice it, it is by definition poorly done. There are some things (e.g. specific animations, UI elements) that RN just isn't great at; the best way to avoid making a janky app is to avoid even trying to do those things.
Interestingly, our team has gone the opposite way (again, no official policy). The deciding factor in our case has been the relative performance of Xcode/iOS Simulator vs Android Studio + Android Emulator. Android Studio on its own slows my computer down more than Xcode compiling AND the iOS Simulator running at the same time. The Android emulator is all but unusable on my MacBook.
I agree there's a lot of painful bits around signing/release on iOS though.
Even with root, you can't touch /usr/bin. It took me way too long to try to figure out what was going on, and all the fixes were hacks.
If there really was an issue there then you either need to file a bug and have them mainline a fix, or yes hack/shim a fix on top for your needs. Perhaps leveraging PATH precedence.
Questioning the use case and insisting that one doesn't actually _want_ to do a thing instead of allowing the user to control their own system is the quintessential Apple experience.
However most people, including me, and I'd venture most engineers too, would rather have a hardened system.
Funny, that's a quote I hear a lot in Linux Desktop ecosystems as well. I think it is just the nature of people so accustomed to a certain way of thinking that any other use case that comes along is automatically considered to be doing it wrong.
If Linux is going to get attacked for not having a single GUI, it would be nice if it weren't also attacked for having a repressive GUI monoculture. /s
But more importantly, it's the question whose laptop it is. I continue to think that the tools I build go to /usr/bin, because that's the way I like it. Apple is telling me I'm liking it wrong.
As for filing a bug with Apple - good one. Every single Apple dev considers rdar:// a black hole, and the chance of getting a fix from Apple because of bugs filed (vs. Apple wanting to fix it anyway) is slim to none.
Overall, Macs are more and more machines that want to prevent shooting yourself in the foot, at the price of less flexibility and access. This is a good choice for some, it's not a good choice for me. (And many other people who like hacking their machines)
Yeah, but there's a certain expectation that the tool that you have installed in /usr/bin is a certain version. There's a reason why tools like Homebrew generally do not overwrite built-in tools.
If you just replaced /usr/bin/python with Python 3, you'd probably break all kinds of things.
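The breakage is easy to demonstrate. This sketch runs a one-liner of Python 2 syntax under whatever Python 3 interpreter is running the sketch itself; the one-liner is a made-up stand-in for the kind of script that would live in /usr/bin:

```python
import subprocess
import sys

# Valid Python 2, a SyntaxError under Python 3.
py2_script = 'print "hello"'

result = subprocess.run(
    [sys.executable, "-c", py2_script],
    capture_output=True,
    text=True,
)

# Any system tool written like this breaks the moment
# /usr/bin/python silently becomes Python 3.
broke = result.returncode != 0
print(broke)
```

This is why macOS (and most Linux distros) treat the system interpreter as an implementation detail rather than something users should swap out in place.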
macOS used to be "Unix, but with a great GUI". It is turning into "iOS, but with a few command line tools".
Because it's my computer that I paid for with my hard earned cash, and I want to.
How far we have fallen.
Yes, you can. If you disagree with Apple's System Integrity Protection, you can take two whole minutes of your life (if even up to that, considering how quickly OSes boot these days) to turn it off permanently and mount the entire drive as read-writeable. For some strange reason, though, I tend to encounter far more people complaining about it than actually taking the steps needed to fix the "problem".
Just install it with MacPorts and you can use all the tools by prefixing them with a g - gls for ls, gdd for dd, etc. Some of these tools are newer versions than on macOS and hence improved (as GPLv3 keeps Apple from including newer versions).
It was annoying that I couldn't do the basic things I expect to be able to do on a machine as root
Apple should roll their own package management tool or officially support homebrew.
Macports did it right, but "brew ..." is just a cooler catchier name.
It literally comes down to that. The more fun name.
When someone is new and confused and overwhelmed, a slight difference like that, or the design of the website, or the charisma of the people talking on forum posts, is the tipping factor in which of the 11 possible things they try.
And as long as the first thing they tried worked, they keep doing it and quickly gain a rewarding feeling of accomplishment and confidence doing that thing.
At that point this is their home and their comfort. They aren't bothering with anything else when it doesn't seem to be any different. They conclude they chose wisely the first time (which is a good feeling that anyone can always simply decide to reward themselves with, i.e. confirmation bias).
So, no one should have ever used homebrew. Homebrew "won" anyway, but not by being the better-architected more technically correct system with wiser engineers.
I forgive all the users for not realizing that the directions they followed are terrible and broken. There is no excuse for the developers writing that system.
So, as for your problem with Homebrew: I say the problem is Homebrew, not Apple.
I can't speak for anyone else, but I switched from MacPorts to Homebrew, after having previously switched from Fink to MacPorts many years ago despite not only being told but having actually experienced that Homebrew was slower and more fragile. And the reason was not "the more fun name." The reason was that both MacPorts and Fink were incredibly slow at expanding and updating their ports tree. Stuff that I knew from my Linux and FreeBSD days, or cool stuff that I read about on weblogs or in places like Slashdot and HN, was nearly always right there in a current or new-current version in Homebrew. That was rarely true of MacPorts: often it wasn't there at all, and if it was, it was often several minor or even a full major version behind.
This may have changed since then -- I hope it has, since I suspect it's been over a decade at this point -- but IIRC, the last straw was trying to install the terminal version of a program (either Emacs or Vim, I think) that also had a GUI version available and, after going to make coffee and coming back to the computer, discovering MacPorts (a) didn't care that I had explicitly asked for the terminal version, it was going to install the GUI version for me anyway, and (b) because of MacPorts' aggressive "never depend on the system version of anything no way no how", it was building a new version of the entire bleeping XFree86 from source for me. Yes, I understand why MacPorts takes that "no system dependencies" approach; yes, I get that was probably a badly-specified port file. But I ripped it out and never looked back.
Well, I take that back. I think I looked back once, about four years ago, to see if everything that I had installed under Homebrew on a machine could be installed with MacPorts, because maybe it was time to give it another chance. The answer was no, everything could not be installed. So the answer was no, it was not time to go back.
I don't care that Homebrew has a fun name. I care that it has the stuff that I want kept reasonably current with upstream versions. Homebrew has, as far as I can tell, gotten a lot better over the years at not being fragile. It's still no speed demon, but the truth is that I don't actually run it that often. If a more native macOS package for a given piece of software is available, I'll install that in preference.
I just plugged an extra SSD into my old ThinkPad yesterday. My boot volume is mSATA. The hard drive bay was empty, and I added the new drive with one screw. Now I have a VM storage pool. The machine now has more storage than my gf's Mac, and it cost 1/6th as much.
I tried something similar with her computer yesterday. Plugged in a USB SATA SSD. ext4 isn't supported. Alt-tab is broken. Using the touchpad for everything is mandatory, because hotkeys are counter-revolutionary I guess? The GNU userland is there, but like half of it is broken or fake, like fs/mtab aren't real? The console clipboard doesn't seem to work. All the apps are already-running instances without a window or something? Is this supposed to be like a phone?
I just want it to fullscreen tmux and leave me alone, but I keep having to search the internet for the cutesy name of the GUI thing that is the only way to do <trivial task>. I could format the disk and put the data back after, but I'm now afraid file I/O in Python will be even more fun than I am already having, so I go find another computer.
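For what it's worth, the Python file-I/O worry at the end is cheap to settle before reformatting anything: a read/write round-trip plus a free-space query behaves the same from Python regardless of whether the volume is APFS, ext4, or anything else the OS has mounted. A minimal sketch (the mount point would be whatever path the drive shows up at; a temp directory stands in for it here):

```python
import os
import shutil
import tempfile

def volume_sanity_check(mount_point):
    """Write/read/delete a probe file and report free space on a mounted volume."""
    usage = shutil.disk_usage(mount_point)      # works on any mounted filesystem
    probe = os.path.join(mount_point, ".io-probe")
    with open(probe, "w") as f:
        f.write("hello")
    with open(probe) as f:
        round_trip_ok = f.read() == "hello"
    os.remove(probe)
    return round_trip_ok, usage.free

# Demo on a temp directory standing in for the external drive's mount point:
with tempfile.TemporaryDirectory() as d:
    ok, free_bytes = volume_sanity_check(d)
```

If this passes on the freshly formatted disk, plain Python file I/O is fine; only filesystem-specific extras (extended attributes, case sensitivity) would differ.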
I keep trying to learn how to use a Mac, but it's always so much harder than just using any other computer that is around that I just can't seem to make time. I find the UI willfully contrarian, and more alien than anything I have ever used. I'm not trying to be an OS fanboy here; I know lots of smart devs who say they like it. I want to "get it" but I just don't "get it". Discoverability seems poor, especially as a user-friendly tool for beginners, but the UI and locked-down software ecosystem seem so limiting for devs. Like, who is this for?
My gf told me to stop trying to think and just click on things… She's a modeler/dev and she hates her Mac, but her employer won't let her use Linux anymore. None of this feels like a dev or "power user" experience. It feels like using Windows XP because the boss made me.
Is this like that thing Neal Stephenson said about the difference between "easy to learn" and "easy to use"? Like if I learn to use it properly I will start having "ah ha!" moments and become more productive?
EDIT: I really don't mean to be flippant! OP asked why people find it limiting. They seemed like they know what they're doing, and might be a dev, and lots of devs use Macs.
OP called it "the best of Unix combined with customer focus", which sounds great to me! I just don't understand what went wrong...
As for how the experience was for me, in terms of developer and power user, everything the Mac has from Unix seemed a bit halfway. It certainly surprised me repeatedly, finding some versions of core utilities to be older and not supporting flags here and there. But this was fine. Homebrew felt sluggish. But this too was fine. Then comes iOS development, and oh lord, what a shit show: the release process, the signing, and all that. This was not fun.
I've reached the point in my professional life where I want things to work. I don't wish to struggle with technical limitations caused by politics and marketing decisions. Development on OSX feels that way. You can't virtualize the OS, and can't emulate an iPhone. You have to go through way too many hoops to do what should be simple.
So, honestly and with no inflammatory motivation, I don't see why people like to work on OSX. I understand that you might wish to if you are limited by software and tools that restrict your choice, making it either that or Windows. And I dare say Windows is, in total, even worse.
For usability and ease of use, as an OS, with free software and whatever other software exists for the platform, it's, in my honest opinion and in an effort not to be too biased or fanboy-ish (again, I wouldn't push my fanboyism on my grandmother), simply easier. For a power user, again not limited by language or tech, there is no competition. None. I'd choose to work for a different company if it meant I didn't have to develop on OSX or Windows. Life is too short.
Linux changes expectations. Everything is possible, there should be a package for that; deconstruct, throw away, replace, and most of the time someone already did it.
So much this. When I depend on proprietary stuff, the care and feeding of licenses and serial numbers and all that stuff never seems to end, and it's shackled to one computer or host name or MAC address or something like that, which changes faster than Calvinball.
It's not even about money. Who has time for all that busywork?
If the Mac platform were freely redistributable and Linux were proprietary, I would switch to the Mac in a heartbeat even if I hated using it, just to protect my time from doing all that IT stuff :p
Sure I can't disassemble my laptop and change the hardware. I have no desire at all to do that. I literally can think of no reason I personally would want to do this, although I imagine there are a tiny fraction of users who would and you may be among them.
Everyone has a device fail them. Then you have the choice of replacing it or repairing it; repair is almost always the better option if you're at all technically competent.
I can't imagine my SSD breaking down or my RAM failing and having to change the entire mainboard when I could just take 45 minutes and fix it. It's literally absurd to me.
SSDs don't fail the way they used to, RAM requirements don't change as much as they used to, and while batteries do die, at this point I've seen more people switch machines before any of that happens.
The last time I wanted or needed to replace memory or storage in a laptop must have been 7 years ago, and I don't see anyone else doing that either. Even desktop upgrades are somewhat dead at this point, except for a GPU upgrade if someone is a gamer, or storage upgrade if people still use direct attached storage and didn't get enough when they initially got the machine.
Just because a scenario that hits a small percentage of DIY users is getting hard doesn't mean it's bad for everyone else as well.
I think the percentage of users that would buy a new $1500+ machine instead of making a small $100 upgrade is much lower than one expects living in a bit of an echo chamber. There are a lot of people who get burned by outrageous repair prices, or who buy new computers when they could simply upgrade a part.
Ask yourself: if you've never felt the need to upgrade a laptop or to repair a broken part, then why would you?
Beats me. You will often see comments in these threads about people (like me) using nearly decade-old Macbook Pros to this day.
I stopped replacing laptops every 3 (or less) years after I stopped buying budget non-Apple laptops.
I'm on Linux, and a laptop I can take apart and replace bits of, so I can see myself using this one for a few years more.
My Dell XPS 15 is suffering from Windows Rot after only two years. When I get a free afternoon I'm going to have to back it all up and clean it out, hopefully avoid reinstalling from scratch. For all the negativity I've had towards OS X, at least it's better than that.
I personally tend to keep my most recent device and the new device at the same time for about 2 years, because of vendor lag (it happens a lot with those classical software vendors that take months between releases) and because I need the ability to compare between versions of both hardware and software. The newest device gets promoted to daily driver (usually with many benefits, as they often are lighter yet more powerful). In general it means 2.25-ish devices per 10 years.
Other hardware, like SBCs, tends to rotate out slower, but generic x86 platforms rotate out faster, because if a critical component fails, the labour of finding parts and replacing them is too much vs. buying up-to-date replacements. Luckily, due to lower usage those machines last longer, so technically the Apple hardware gave the non-Apple hardware a longer lifecycle in my case.
There is some irony in there as I do provide board-level repairs and the machines I work on for other people are ones I'd never personally invest in.
And by degradation I don't mean the existing device degrades per se, but the degradation of productivity on the current device vs. new device.
At the same time, the way work is done has changed a lot for me and the people around me: heavy workloads are almost never done on a laptop anymore.
What does this mean? Do you mean your current device starts working poorly (but that would mean "existing device degrades per se", which you ruled out) or that the existence of a new device automatically makes you perceive your device as being of "degraded productivity"?
Say you change your workload model: there might be ~20% differences between bare metal, virtual machines, and containers. If you simulate part of an infrastructure using containers you may not need more RAM, but more CPU would be nice. But when you then want to do a lot of recording/capturing and process that, RAM gets more important. Just upgrading the RAM wouldn't help much, because without a CPU fast enough to generate the data you might as well offload the whole thing.
This is definitely an echo-chamber/bubble point of view. The vast majority of users out there don't even know or care what AVX is, don't use external GPUs, and don't know or care about the difference between Thunderbolt 2 and 3.
If you personally need these things and want to buy a new machine every few years, then that's great, you should do that. But there are a ton of people who would benefit from an easily-repairable, easily-upgradable (RAM, storage) machine that end up dropping $1500 every few years instead of the couple hundred they could instead spend for a reasonable upgrade.
While I bet that there are a lot of people that do want to modify their systems, they are such a minority that it's not very logical for a large multinational to invest in that to the detriment of other goals. It might simply mean that you are not the target audience for their product(s).
Some other manufacturers/brands have the same, while others do a mixed portfolio to cater to smaller groups as well. We also have large manufacturers that cater to the classical enterprises which still run on the old idea that you need a fleet of identical machines and then swap out parts all day long, so machines that have facilities for that exist. Most notably Lenovo, HP and Dell do that.
I whinge about the non-upgradability of our devices as much as the next guy, but it's not like people are throwing their 3 year old macbooks in the trash can.
Work one, sure. But I haven’t upgraded my Mac in years.
Most people get close to 8 years before they actually want a new one when their work doesn't change much -- if your requirements don't change and your tool fits, no reason to change.
My opinion is that people switch machines more often precisely because they're not upgradable/repairable. I think if most people could go to their local computer repair shop to get RAM/storage/etc. replaced or upgraded, we'd have a lot less electronics waste, and consumer computers would last a lot longer.
At this point, software CPU and GPU needs aren't increasing all that much year by year, unless you're a gamer or do HPC. For the rest of everyone -- the majority -- a CPU and GPU made 5-10 years ago is still just fine for what they want to do. These old machines just need more RAM and sometimes more storage.
I get that it's hard to fit sockets for replaceable modules when everyone wants a super thin laptop, but overall this is a source of so much economic and physical waste.
Most computer repair shops around here have disappeared because it's no longer the way people buy and use their computers. At least not enough people to keep those shops running.
Sometimes people upgrade because they feel like it. Doesn't always have to do with the numbers.
However, I think Apple (and other laptop manufacturers) cross the line when they ship soldered-in batteries with their laptops. Batteries are consumable components that are guaranteed to degrade over time. Unlike RAM, which, assuming Apple sources it from quality vendors, you can expect to last for many years, laptop batteries will definitely begin to degrade within a few years, and will, over enough time, render the laptop unusable. Apple is producing $1500+ laptops with a non-replaceable component that will, without a doubt, eventually break down. Your options at that point are to buy a new laptop, or to send it to Apple so they can charge an arm and a leg to replace the entire top case instead of just the battery (as that'd be impossible). You should buy a new laptop when you want to, not because you "might as well" without the option of battery replacement, and with Apple's "repair" option costing a quarter of the price of a new one.
Though to a lesser extent, it also irks me that the SSDs are soldered in, not necessarily because I'd want the ability to upgrade them post-purchase, but mainly because your valuable data lives there, and it will become trapped on a dead board if something goes wrong with an unrelated component. Even if failures like this are rare, the threat of data loss while using a MacBook would still concern me more than while using a laptop with a replaceable SSD. I believe Apple has a data recovery procedure, but you'd still have to send your laptop to them and cross your fingers in the event it dies, whereas if the SSD wasn't soldered in, you'd have the option of manually recovering your data, or taking it to a repair shop where they could do it for you.
I would love to see a MacBook Pro with, at the very least, a replaceable battery (but especially with a replaceable SSD as well).
I know that a lot of people are very emotional about their RAM and SSDs and it can be a PITA, but it doesn't affect as many people as you'd think. At least that's what the available data shows.
Though, to be honest, I don't bother unless there's a particular application which isn't available on Linux.
The latter works fine for me.
Final Cut Pro
Sounds like non-iOS software developers that are stuck on macOS can just hop right off without any problem whatsoever...
Being able to swap out RAM and storage and battery from factory-provided can extend the lifetime (and even increase the performance capacity) of a given machine at nothing more than the cost of the new parts. And Apple used to make it pretty easy to do this, and enough people did that there was a subeconomy of vendors selling for this purpose, although of course there were full-service options for people who literally couldn't imagine doing it themselves.
This also meant that downtime wasn't controlled by Apple Store service availability (literally had a drive fail on me once, had my bootable backup, swapped it into the machine, good to go).
And on a number of occasions I've done more complex replacements, a DVD drive here, a keyboard there. I can see why most people wouldn't want to do those, they're a giant pain, but having the option can be empowering.
What's actually hard to think of, if one is thinking, is what Apple has gained in return for this. I can kinda squint and see that irregular (and therefore, perhaps, less efficiently swappable) battery shapes have some credible advantages, but the rest of the marginal ounce-gains and dimensional-golf scores belong to a category of diminishing returns.
It isn't just the cost of new parts. It's also the cost of those parts being larger in order to accommodate user-accessibility and replacement. When everything is surface-mounted on the PCB it can be made a lot smaller. This allows Apple to shrink the laptop as well as allocate more space for the battery.
No one is asking for phones where we can swap out the RAM and storage. Why do we want this from laptops? I'm typing this on a 2020 Air which I configured with 16GB of RAM and a 1TB SSD. I really don't anticipate a need to upgrade either of these things before the machine reaches end of life anyway.
Why shouldn't we? I would gladly sacrifice a few mm of thickness for a machine where I could upgrade the RAM and storage (and replace the battery) every few years. Not only would that save me money, but it's much more sustainable from a manufacturing and waste perspective.
I just finally threw out my old (sadly broken) 12" G4 PowerBook, and marveled that it had a removable battery (you don't have to disassemble anything; you just flip a latch on the outside, and the battery slides out). And I remembered that at some point I'd done a trivially easy disassembly to upgrade the HDD. I got that laptop secondhand, when it was already a couple years old, and it lasted me a good five years, and probably would have lasted longer had my then-girlfriend not dropped it on concrete, which somehow fried the drive controller.
You’re part of a very small niche. The tinkerer who doesn’t want to jump to Linux. This is a tiny number of people, sadly, just as the number of people who want to tinker with their cars is very small. Most people just want something that works and is very convenient. They prefer to leave the service to service people.
Not saying you'd have to. Battery technology has improved since then; we can make lighter batteries that last longer. Hard drives have been replaced by small PCBs with a few chips on them; again, much lighter. Further miniaturization of the internals means a smaller chassis which means less material; again, much lighter.
It'd be perfectly possible to build a laptop with similar dimensions and weight to the current crop of laptops, but allow for easier battery replacement, and even RAM/storage replacement. I'm not saying it'd be as simple to swap as it was in 2005, but it'd at least be easier/possible and not involve special tools.
Apple seems to implicitly claim that their anti-repair/anti-upgrade stance is due to "delighting customers" with smaller, lighter hardware, but I suspect it's mostly driven by their desire to lock down their hardware against tampering, and to keep people on the upgrade/purchase treadmill every few years.
> You’re part of a very small niche. The tinkerer who doesn’t want to jump to Linux.
I actually do run Linux (previously on Mac hardware, but I finally gave up on it last year), so that's not the niche I'm a part of.
But that's not really the point; I'm not speaking for myself, I'm speaking for average users. I see a lot of posts here claiming that average users "don't think about" this or "don't care about" that, and I posit that the average user doesn't care about an extra few millimeters of thickness or an extra few tenths of a pound of weight. Especially if it doubles or triples the useful life of the device through a RAM/storage upgrade and easy/cheap battery replacement every few years.
> They prefer to leave the service to service people.
That's fine too, but Apple is actively against allowing a robust, price-competitive field of independent repair shops. And even if they weren't, soldering in the RAM chips and NVMe drives means the only thing those repair shops could do would be to swap out batteries and main boards, rather than do cheap, targeted upgrades.
Yes, modern Apple laptops gave up the "brick battery that slips out with a latch." But in its place we have terraced internal batteries that can fill any volume.
I'm kind of curious to see how usage goes for AWS Graviton 2 instances in the future, as Apple laptops transition to ARM and there's less friction to deploy ARM-based toolchains and artifacts.
While that is true, how many users of any kind of device actually considers that an option these days?
Throwing in words like 'right' and 'unjust' helps in movements and call to action etc, but in this context it's about someone turning on their device, doing some work, and turning it off again.
All of those points are fine, but they are points separate from a user using a device.
If you want it for environmental reasons, say that. If you want it for freedom, say that. Both are valid reasons, but they don't have anything to do with someone using their machine, unless that 'usage' means doing stuff with the insides. And before someone jumps in to say that that is the main activity: great, but most people buy those machines to do work that doesn't entail opening them up.
Freedom 'to do what you want' has not had much to do with what works in the world of selling to the masses when it comes to user experience (and yes that is ironic).
Regarding 'using' and 'their' and 'machine': I meant an activity based scenario which is what most people see themselves doing. They mostly have no concept of ownership, actions or device semantics. The reason is simple: it's not needed to be a user and it's not needed to get the same results as other people in your group have. And for a lot of people all that matters is doing the same as the rest of the local group.
Edit number 3: you can apparently select your default protocol handler in iOS 14. I bet there's a double-digit percentage of Google users that want that for Chrome, and a few sub-1% users that want Firefox. But that's still not freedom, because they all use the same renderer and JS engine. On the other hand, it's unlikely that people care about that, and the people that do are not likely to run iOS at all. Luckily, you don't have to use iOS. Or a phone. So some aspects of freedom are unchanged.
All true but irrelevant. The problem is that artificial lack of choice prevents competition and innovation. Microsoft from the 90s is here again. And users don’t care, again.
I do not count another UI as a new browser. It should be different under the hood.
Can you imagine any general user ever repeating that? I can't. As those users would say: nobody cares.
I personally would care, but I'm not the main marketed target for Apple. I also care more about getting stuff done than what engine my browser uses.
This is why healthy competition and anti-trust actually matters. It's not because your average user cares about the details, it's because they're being impacted even when they don't know it.
This subthread has a fairly trivial example of this, but when you get into right-to-repair it becomes even more important. Your average consumer could be spending a lot less money to have a much better experience. The fact that they don't know about that is part of the problem; lack of knowledge doesn't make the issue irrelevant.
And we don't even need to get into the environmental arguments around reducing waste to consider this aspect of the issue.
Sure, if you keep a population ignorant, they won't care about the pettiness of their lives, as they don't know about freedom.
You should watch Louis Rossmann, who does repairs, to see how purposefully Apple makes any repair hard.
If you personally care even a little bit about the environment, there is a pretty evident reason why being able to swap a single piece of hardware is better than replacing the whole unit because everything is soldered together.
Right to repair would also mean that once your device goes out of guarantee you would have more options to get your device repaired for cheaper.
* Apple charges +$800 for 64GB of RAM, which on Amazon would cost you around $300-350.
* Apple charges $1600 for 4TB of SSD, which on Amazon would cost you around $450-600.
* Soldering the SSD to the motherboard is an enormously horrible engineering decision for servicing, backups, and data recovery alike.
And why do you think they are exclusive to each other? I have, I kid you not, a nearly 15-year-old washing machine that I am still using. I had to replace 2 parts, due to wear and tear, in the last 5 years, and it still runs as well as ever. I am glad that I could repair it for a fraction of the cost rather than dump it and buy a new one.
And a good sell-more-laptops business decision?
Hmm yes I do: https://twitter.com/RDKLInc/status/1275100376350384132
Have you also never had anyone you know spill liquid on their keyboard after the warranty expired? Or drop their laptop?
So in the end yes, I am pretty sure that there are literal millions of devices being thrown away because repairing them is too expensive.
Edit: I'll also add that in 6 years using a Mac, Apple replaced my MacBook entirely twice, for issues that you would hope could be fixed without throwing everything away. So, sure, the customer is happy because they got a brand new device. But personally I would rather stick with my 1-year-old device and reduce the environmental impact.
Regarding spilling on keyboards: I don't know anyone personally who does that, in my local area people don't eat or drink next to their devices. I do have plenty of people I don't personally know that did ruin their computers that way and I have replaced plenty of Apple keyboards and entire topcases to solve it. Works fine as a side-business.
Repairing is not the same as upgrading, and recycling is not the same as throwing away, and neither is reusing. The difference is important, as the solution changes with the case.
Overall, this would be more fitting in a general discussion on environmental impact, and not so much in one about software, the hardware it runs on, and human behaviour. If you want to solve a big problem you need to generalize it instead of branding it, as far as I know.
To dive in a little deeper: to 'fix' this, people need to spend less time on how beautiful their stuff is, or how cool they look, and go a more utilitarian route. But since the American economy is built around the opposite, and the western European one has a tendency to follow it in some aspects, that's unlikely to happen any time soon. Changing people is hard, but it's also the best way to solve or fix or change anything.
Sorry, I thought this was illustrative enough to make my point, but if we have to be pedantic, then here you go: the exact same thing, but with MacBooks:
> in my local area people don't eat or drink next to their devices
Interesting local area you have there; at my office, and all the offices I've been in in the past, everyone has a cup of coffee right next to their laptop.
Again, two MacBooks and not millions, as explained in the twitter thread. Again, not defective but locked by the previous owner. Not even remotely related to 'right to repair'.
As an owner, I want to be sure that when I lock a device, it's actually unusable to anyone else. So this is good (for me).
You want your data to be inaccessible. Not providing a way to factory-reset a perfectly functioning device (say, by replacing a modular storage) is just irresponsible.
I am incredibly disappointed with every update from Apple.
I bought my MacBook Pro in 2011.
The most basic 13.3" machine that was available for sale.
320GB 5400RPM HDD, i5, 4GB RAM, Intel graphics.
I like backups. The optical drive had to go; I put it in an external USB enclosure. The HDD took the optical drive's place, and in the HDD's original slot I put a Samsung SSD. The HDD holds my data archives and a macOS installation image.
Whenever macOS would become cluttered, I would wipe the SSD and enjoy a fresh macOS installation. Whenever a new macOS version would be released, I would update the installation image.
This enabled me to work on the go with no fear of data loss and no network dependencies.
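That wipe-and-reinstall routine can be sketched as a small shell script. This is only a sketch: the device node and image path are hypothetical placeholders, and it defaults to a dry run so nothing destructive happens by accident. `diskutil` and `asr` (Apple Software Restore) are macOS's real disk tools; everything else here is illustrative:

```shell
# Sketch of the wipe-and-reinstall routine described above.
# TARGET and IMAGE are hypothetical placeholders; adjust for your machine.
TARGET="${TARGET:-/dev/disk1s2}"
IMAGE="${IMAGE:-/Volumes/Archive/macOS-install.dmg}"
DRY_RUN="${DRY_RUN:-1}"   # set DRY_RUN=0 to actually run the commands

run() {
    if [ "$DRY_RUN" = "1" ]; then
        echo "would run: $*"
    else
        "$@"
    fi
}

reinstall() {
    # Erase the boot SSD, then restore the saved installer image onto it.
    run diskutil eraseVolume APFS "Macintosh HD" "$TARGET"
    run asr restore --source "$IMAGE" --target "$TARGET" --erase --noprompt
}

reinstall   # dry run by default: prints the commands it would execute
```

Restoring from a local image like this is what removes the network dependency: nothing has to be downloaded at reinstall time.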
SSDs are basically long-lasting consumables. I've painlessly replaced the drive twice, and each time I got a faster, more robust unit with more capacity. Sure, I could carry an external drive with me. I have no desire at all to do that.
I like the mini-jack, so I can plug in my favorite headphones. The audio-out port also doubles as an optical-out. Sure, I could carry a couple of dongles with me. I have no desire at all to do that.
Sure I could have bought the top of the line 8GB RAM model back in 2011. I had no desire at all to do that. I could just buy 16GB of RAM and perform an upgrade Apple claims is not supported.
The battery lasts long enough. While coding, I get drained before the battery does. I can let the battery age with no fear of it bulging because it's contained in a shell. Replacement is easy.
I totally get where you are coming from, and agree with you that "a tiny fraction of users" want access to the hardware.
However 100% would want access to the hardware when their 16" MacBook Pro SSD goes.
Don't professionals require machines that cater for their needs? If the current Apple offered choices like the ones I mentioned above, would you not pay a premium to have them?
I certainly would. And for that reason I still rely on the same 2011 MacBook Pro, and not the latest MacBook GoodEnough.
Apple has been slowly whittling away at the MacOS value proposition for developers.
Yes, I think the success of the iDevices have made them change focus. They are no longer interested in making devices for the power user. They now make more money selling iDevices, especially recurrent revenue. And so the ignorant consumers are now their target market.
I think on my 2016 MacBook at home, the only time I get bottlenecked on performance is CPU while playing EU4. Oddly, all the powerusery stuff I do could probably be done equally well on a 10 year old device if it wasn't for USB-C.
You are happy with the sluggishness in Catalina from security checks over the network before running applications?
> Sure I can't disassemble my laptop and change the hardware. I have no desire at all to do that. I literally can think of no reason I personally would want to do this
The cost of repair is higher because independent shops cannot repair the boards anymore or recover data when your storage, memory, and everything else is soldered directly to the board.
I don't doubt that you enjoy the platform, but I would be hard pressed to say that I would be happy with everything from any tech company.
That's a lie. Every power user tries to get the most out of their machine, and so desires an upgradeable system. No power user likes to throw away a system because they can't replace its battery or upgrade its RAM or HDD/SSD. At some point, everyone likes to upgrade their existing system without spending a gazillion bucks on it.
Yes, it is true that with phones and tablets, corporates like Apple and Samsung have managed to convince the general mass that not being able to upgrade your phone or tablet is a NORMAL thing, and soldered RAM and soldered SSD are the only option, but we power users know that's a lie that Apple and Samsung (and others) would like us to believe.
And that is why their marketing is heavily geared to pushing the exact message you are parroting:
> I imagine there are a tiny fraction of users who would and you may be among them.
I can bet that even someone who is not a power user, just an ordinary consumer / user, would appreciate the option to upgrade their phone / system and reuse it. That is why we have efforts like the PuzzlePhone in Europe.
Have you ever heard of the No True Scotsman fallacy?
The only current option is to drop $1500 on the new model. It's not only more expensive, but wasteful.
I guess there are a lot of younger folks around here nowadays, but it was perfectly normal in the 90s and early 00s for regular non-technical home users to get out a screwdriver, open up their desktop PC, and replace the RAM, add a new/larger HDD, add a CD/DVD drive, even swap out or add a graphics or sound card. And there was a perfectly functioning computer shop industry that would do that for you on the cheap if you weren't comfortable.
In the past 15-20 years manufacturers have gotten us into the cycle of buy->discard->replace to the point that people don't even know it was ever different.
Power users want freedom and control of their own machines and hardware. The moment that Apple started soldering parts into place, I started looking elsewhere because it was a signal of the thought process.
Very true! The first thing I did after buying a Mac Mini was to upgrade the RAM and change the HDD to an SSD, and I even added an extra HDD. The Apple options for the same were nearly triple the cost. Only an idiot would believe that having soldered CPUs, soldered RAM, and now soldered SSDs is a good thing. It is a good thing only for Apple's, Samsung's, et al.'s bottom lines.
And with ARM processors, you can be sure you'll get a PC with a locked bootloader that won't allow you to even install other operating systems. And the offered OS will also be a dumbed-down system like iOS that will only allow you to install from a locked app store. All the while leeching your personal data off to the cloud.
I too have started exploring other alternatives.
I know people like to have the feeling of control (do you really control your laptop if you can't replace your EC, CSME, AGESA, SSD FW, NIC FW, or VBIOS?) and the concept of shuffling parts around, but the need for that (at least in my circle) has pretty much faded.
Yes, people exist that do that or want that or really need to save every cent possible, but to many others that is not even interesting as a thought exercise.
So I referred to mid-lifecycle upgrades because the people who didn't initially go for the cheap option usually only swapped parts when the machine was still good enough and replacing it wholesale was too much of a hassle: that's when upgrades made sense.
I've upgraded hard drives and RAM on every single Mac I've ever owned, except the last Mac I purchased, a 2015 MBP, where it wasn't an option.
It's looking like this one will be literally the last Mac I purchase. Apple's practice of charging premium, bespoke prices for commodity hardware was insulting, but at least there was a reasonable workaround before they started the soldering crap. Soldering things down is just a brazen money grab.
Can't believe I'm looking at going back to Windows but here we are.
I've personally upgraded the hard drive and replaced the battery twice on my old 2011 MacBook Pro.
I did upgrade someone's 2015 in 2018 with a bigger SSD because they needed 2TB and it wasn't available at that time (they were on 512) and they used it as a local buffer for footage before offloading. Because they only have USB3 or TB3 drives and no TB2 drives, the multiple transfers on that MBP are too time consuming to do in the field.
Ditto for the storage; I have 512GB in there now (probably wasn't the max when I bought it, but seemed sufficient and was a good price trade off), but I wouldn't mind swapping it out for a 1TB part. Actually, this bit might be possible to do myself; I should look into it.
I'm not saying soldering things down is ideal, but it's only a problem because people perceive an option that used to be there, and now isn't, as 'bad faith' or 'taking away' something. Imagine the riots if we used to be able to change the CPU in a laptop. (In theory we 'used' to be able to, but in practice it never worked well: the microcode didn't fit in the firmware, and the cooling solution only tolerated a small amount of TDP variance, so popping in a faster CPU meant it overheated so quickly it was useless.)
Would it be cool if all devices were composable? Sure. Is it something users care about? Not really.
I'd like it if Apple found a way to allow swapping of components within the design envelope constraints they created for themselves, but it also won't have an impact on what I'd buy.
Yes, they do. Sometimes a technology like SSD itself gives a great boost in performance equivalent to a new costlier system.
That's literally why I switched to Linux in 2018, after a decade on OS X.
I haven't used Linux as a main driver in a decade but I do use it very frequently and I am intrigued by all the many ways people online seem to be able to break it. I have never once run into a problem with apt.
It really makes me wonder how much time people who complain about Linux have really put into learning it. I'm not thinking you need to read a book on it, but after a few months of just using it I would think one would be reasonably proficient?
I understand the claims of needing proprietary software. I, too, wish that the Office suite was a first-class citizen. That said, it absolutely baffles me the number of weird ways people have issues.
Raises hand my home server (running Debian) is currently in a weird state where a whole bunch of packages won't install or update because some library's stuck on a version they don't like. The version they want isn't available. I'm pretty sure I haven't even added any nonstandard repos to it (all it really runs is docker, so anything odd comes in as a docker container rather than a system package—this in part because I don't trust Linux package managers not to break my system when installing non-critical software due to long experience of same happening over and over, go figure). I'm not sure what I did to cause it and it's still running Docker fine so my incentive to spend an hour tracking down and fixing whatever-it-is has been zero—I'm just glad it's not a system I depend on for anything serious or need to really do anything with.
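For what it's worth, the first hour of untangling that kind of apt knot usually looks like the sketch below. It's non-destructive (simulation and inspection only); "libfoo1" is a made-up placeholder for whichever library is actually stuck, and the root-only repair commands are shown commented rather than run.

```shell
# Sketch: diagnosing a Debian box where packages are held back by a pinned
# library. "libfoo1" is a hypothetical placeholder, not a real package.
if command -v apt-cache >/dev/null 2>&1; then
    apt_present=yes
    # Which versions does apt know about, and from which repos?
    apt-cache policy libfoo1 2>/dev/null || true
    # Dry-run the upgrade to see the resolver's complaint without changing anything:
    apt-get -s dist-upgrade 2>/dev/null | head -n 20 || true
    # Explicitly held packages are a common cause of "kept back" messages:
    apt-mark showhold 2>/dev/null || true
else
    apt_present=no
    echo "apt tooling not available on this machine"
fi
# Root-only fixes worth trying once the cause is clear (shown, not run):
#   sudo apt-get install -f      # repair broken / half-installed dependencies
#   sudo dpkg --configure -a     # finish any interrupted package configuration
```

The `-s` (simulate) flag is the key trick: it makes apt print exactly which dependency it can't satisfy without touching the system.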
I spent a few years running Gentoo as my main machine in the 00s. On a laptop. Got suspend-to-disk working, even. I had to manually install Grub on this same Debian home server when I built it last year, in a chroot to my newly-installed system, because the installer kept failing to do it (my process to fix it was very ordinary and encountered no problems, AFAIK, so I still don't know why the Debian installer couldn't do it, I've never seen it fail quite that way before, but it did so repeatedly so I had to give up and open up the engine, as it were). So I'm not entirely clueless.
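For anyone who hasn't had to do it, the chroot GRUB repair described above looks roughly like this. The device names are placeholders I've invented for illustration; substitute the real root partition and boot disk, and run the commented steps as root from the installer's rescue shell (they're deliberately not executed here).

```shell
# Placeholders: substitute the actual devices before running from a rescue shell.
root_part=/dev/sda2   # the freshly installed root filesystem (assumption)
boot_disk=/dev/sda    # the disk that should receive the bootloader (assumption)
echo "would reinstall GRUB on $boot_disk using root $root_part"
# Actual steps, as root, from the installer's rescue environment:
#   mount "$root_part" /mnt
#   for fs in dev proc sys; do mount --bind "/$fs" "/mnt/$fs"; done
#   chroot /mnt grub-install "$boot_disk"
#   chroot /mnt update-grub
#   umount -R /mnt
```

The bind mounts matter: `grub-install` and `update-grub` probe `/dev`, `/proc`, and `/sys` from inside the chroot, and fail confusingly without them.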
Nonetheless I don't really feel confident using a Linux desktop I can't snapshot for emergency rollbacks or rebuild in a few minutes from a script, because damned if weird problems don't crop up when you upgrade. Or don't upgrade. Or reboot, forgetting you'd installed (through the blessed package manager!) a new kernel and now it can't read your encrypted root anymore so is unbootable and now you get to spend some time figuring that out. And so on.
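On the snapshot point: a btrfs root does make pre-upgrade snapshots cheap. A minimal sketch follows; the subvolume layout ("@" for root, "/.snapshots" for snapshots) is an assumption and varies by distro, and since the real commands need root they're shown commented rather than run.

```shell
# Sketch: a pre-upgrade safety snapshot on a btrfs root.
# The "@" / "/.snapshots" layout is an assumption; check your distro's defaults.
snap_name="pre-upgrade-$(date +%Y-%m-%d)"
echo "would create read-only snapshot /.snapshots/$snap_name"
#   sudo btrfs subvolume snapshot -r / "/.snapshots/$snap_name"
# If the upgrade breaks the system: boot rescue media, mount the top-level
# btrfs volume at /mnt, and promote the snapshot in place of the broken root:
#   sudo btrfs subvolume delete /mnt/@
#   sudo btrfs subvolume snapshot "/mnt/.snapshots/$snap_name" /mnt/@
```

Tools like `snapper` or `timeshift` automate exactly this dance, including taking a snapshot automatically before every package transaction.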
I feel no such anxiety on macOS. Not that it never breaks, but it's rare enough I don't worry about it. FWIW I'm back on desktop Linux now due to Macs' insane prices, but I suspect I made a mistake and the time I've already lost would have made spending an extra $500 on a significantly worse-spec'd machine well worth it. It's a little here, a little there, but it adds up.
In that same time, Linux really caught up and diverged such that it offers features in its kernel, like cgroups and kernel namespaces, that macOS doesn't match, leaving me to feel as if the old FreeBSD roots of macOS are less of a perk and more of a kludge in 2020.
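A tiny taste of the namespace features mentioned above: on most modern distros an unprivileged user can create a user + UTS namespace, appear as root inside it, and change the hostname without affecting the host at all (this is the primitive containers are built on). Some locked-down kernels and container environments disable unprivileged user namespaces, so the sketch degrades gracefully.

```shell
# Create a user + UTS namespace, become "root" inside it, and set a private
# hostname. Nothing outside the namespace is affected.
if unshare --user --map-root-user --uts sh -c 'hostname demo-ns && hostname' 2>/dev/null; then
    ns_status=supported
else
    ns_status=unavailable
fi
echo "unprivileged user namespaces: $ns_status"
```

macOS has no equivalent of this; there, Docker and friends have to run a Linux VM under the hood.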
Today, I run Linux on my MacBook because it is a modern Unix clone that looks good and works well with all of my development tools. Native Docker is a treat.
1. My MacBook Pro 2017 had some weird problems due to the CPU voltage. Somehow they misplaced the laptop, but they eventually did a full top case replacement. It was weird when someone came out of the store in handcuffs while I was in the Apple Store. With this laptop I wasn't disappointed by the keyboard or the Touch Bar, and I never experienced reliability issues. I sold it out of frustration, but I shouldn't have.
2. My AirPods have been replaced twice due to buzzing. This could have been me not cleaning them; I still experience buzzing sometimes, but cleaning them seems to fix it.
3. iCloud got messed up on my MacBook 2016 when I installed a beta version of Catalina, and that was my fault. I had to contact support, and they never got back to me about an engineer looking at my iCloud.
4. My MacBook Pro Retina (around 2012) had a zebra pattern on the monitor. I took it to the Apple store twice and they just gave me a new laptop.
Even with all of these issues over the years, Windows is much more of a pain to use. On my gaming computer, Wi-Fi can still take minutes to start. I had to upgrade it to Wi-Fi 6 with an external adapter, and that may have fixed it.
My current laptop (MacBook Pro 16) has good thermal management now. Plugging into a monitor does increase the thermals to 60 °C. Sometimes I turn off Turbo Boost. Other than that, having a laptop with 64 GB of RAM, a 5500M graphics card, and a 2.3 GHz 8-core Intel Core i9 is pretty nice. Somehow I was actually able to fine-tune GPT-2 for a few iterations on the CPU, though that was by mistake while I was cleaning data.
I like the Touch Bar because it sometimes lets me avoid switching to the trackpad, which helps with my RSI. I also have an external trackball mouse.
Wifi and Bluetooth connect with no issue. The system is really responsive even compared to my Windows machine. The screens have always been amazing and color accurate. Emacs runs great too. Terminal.app keeps on getting better. The UI looks like it's going to be cleaned up in macOS 11. Having ARM is going to be very exciting, and Metal is even more exciting and might be the only thing that will start to compete with NVIDIA.
My workloads are getting to the point where a Colab Pro notebook sounds like a good idea, or setting up my gaming / deep learning computer and using RDP into it on Windows 10. But by now I'd probably even want to install Windows Server 2019, because the NVIDIA drivers would support the card, and when I run video games I wouldn't have to manually kill background services that subtract 30 FPS and make them unplayable on a Titan RTX.
Also note that at the time he was creating NeXT, a Unix company.
I actually had a look 10 minutes ago at how much a 32 GB Mac costs nowadays -- and it's about 2x more than a 32 GB PC. That's less than some years ago, when a Mac was more like 3x more expensive, if I remember correctly. Looks like the trend Jobs mentions in the video, about lower and lower ASPs. (ASP = average selling price, right?)
The other reason I switched was because all of my developer friends knew nothing about Windows, and getting stuck on a maybe Windows problem meant I had to fend for myself when I needed help the most.
I'm not a fan of the new OS X; I guess mobile is more prevalent, and maybe Apple is hoping the iPhone and iPad can be gateway drugs into MacBooks / desktops by being more of the same.
Wish more efforts like the PuzzlePhone in Europe would bear some fruit.
What do you mean by that? (Personally, I can't think of anything I can't do now that I could do five or ten years ago on my Mac. Yet a lot of things are better.)