Leaving the Apple ecosystem behind (h2x.sh)
511 points by recvonline on Sept 22, 2021 | 487 comments



I'm confused. From his success list:

I basically went back 5 or 10 years and replaced every "modern" technology solution. I pay now way more than I did with iCloud, but I am back in control. I am more productive.

- Listening to Music takes 3 clicks and just a few seconds

- Wired headphones never run out of battery and have superior audio quality

- I can take real photos with high quality instead of relying on ever-newer iPhone models costing thousands of dollars, which will always have lesser quality than a mirrorless camera at the same price

- I have fun again discovering bands and artists on Bandcamp instead of mind-numbingly listening to Apple Music playlists

- Coding via neovim in a terminal, and staying on my keyboard to navigate not only tmux and co. but also my OS, is way more productive and faster

Which of these can I not do on a Mac?


The measurement of success in this case doesn't seem to be that he's gained new abilities, but rather the poster figured out how to gain them without Apple.

Most of these seem like Pyrrhic victories to me, as the Apple versions rose to success with all of these as competition by providing a better experience. I certainly wouldn't want to carry a mirrorless camera everywhere, nor deal with Bandcamp.


> the Apple versions rose to success with all of these as competition by providing a better experience

At one time, Apple clearly was a better experience.

Now, with features removed and new "privacy and security" features added, I'm not sure they are a great experience. For example, my MacBook reminds me of Windows Vista, only worse, every time there's a system update and I have to reboot to re-permission the camera for a web conference.

I've learned to dread OSX updates because instead of adding new useful stuff, it seems like we just move things around, change out the icons and add some more intrusive "touch the fingerprint reader" authentications... plus I have to re-permission half of the apps and hardware just to do my job. Then there's the whole reboot, unlock, install, reboot, re-lock cycle. It's seriously worse than Windows Vista. Anyone remember the I'm a Mac/I'm a PC commercials?


I have never needed to reboot because of permissions, basically just had to click “allow” a few times, so this stuff doesn’t really bother me. It’s a very slight inconvenience.

Big Sur performance on older hardware has been a disaster though. Even on a $2500 MBP 15” from 2017. On M1 though it’s excellent.


Every update I need to reboot twice to get the Google Drive system extension to load, but that's the only issue I've had like that.

However, Big Sur runs alright on my 2015 MacBook!


Having written system extensions (both kext and DriverKit), that seems more like an issue with the extension itself.


That may be, but Apple broke a formerly working API. If Google can't figure it out, that's a signal.


I mean... no?

If you're referring to the Kext/DriverKit changeover, this is just false - Kexts still work in Big Sur, they're just trying to slowly shift the ecosystem to DriverKit. Nothing has truly changed there yet.

If you're referring to a broken thing within Kexts/IOKit itself, that would be a pretty severe bug that would get attention within Apple. I feel confident saying this as I've reported bugs like this and they get appropriate priority levels.

Lastly, if you're writing (signed) driver code, you generally have access to resources within Apple to get answers to questions. I'm not even a large company and it was relatively easy to get in touch with those teams - and this stands in contrast to other teams within Apple.


> I've learned to dread OSX updates because instead of adding new useful stuff, it seems like we just move things around

To be fair, that also describes Windows. As best I can tell, every release since 7 has primarily focused on renaming and adding indirection to the ways you get to the same old control panels.

And I haven't ever had to reboot to allow a camera. I have several, ranging from a microscope to an SLR that also is my webcam.


> As best I can tell, every release since 7 has primarily focused on renaming and adding indirection to the ways you get to the same old control panels.

I thought this was the UI museum feature where you can time travel back to Windows NT4 one layer of indirection at a time.


I still don't know how they managed to make the Settings replacement for the Control Panel so bad in Windows 10.

For example, you go to the proxy setup and the PAC URL location is a tiny text box no matter how big the window, and if you have restrictions on your system you can't view the full address or copy it.


Some things in the Windows 10 UI travel all the way back to Windows 1.0.


Sounds like something is wrong with your TCC.db file. When did you last check permissions?

Also, in case this helps:

https://cipherlog.blogspot.com/2020/12/macos-microphonecamer...
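For what it's worth, a common way to clear out a wedged camera/microphone grant without editing TCC.db directly is Apple's `tccutil` (the bundle identifier below is a placeholder, not a real app):

```shell
# Reset the camera and microphone permission records for one app
# (com.example.conference is an example bundle identifier)
tccutil reset Camera com.example.conference
tccutil reset Microphone com.example.conference

# Or wipe the records for every app, forcing a fresh prompt on next use
tccutil reset Camera
tccutil reset Microphone
```

After a reset, macOS re-prompts the next time an app requests the device, which avoids touching the database file by hand.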


Thanks, but those instructions did not solve the problem for me at all.


> nor deal with Bandcamp.

Curious what's wrong with dealing with Bandcamp? They seem just about the best place to buy music online. Majority of the money goes to the actual artist and you have a good choice of formats.


> Majority of the money goes to the actual artist and you have a good choice of formats.

Agreed. Bandcamp does this right by offering V0 compressed MP3s, 320kbps MP3s, AIFF, FLAC, ALAC and I think even Ogg Vorbis.


And, for the past 18 months, they've had a day where they waive all fees so bands that can't play live anymore can stand a chance of making some money.


Bandcamp is great, and there is something really wrong with Apple Music in Firefox on Windows. It's very slow, and it seems to do authentication _after_ loading the rest of the page, and often fails.


Glad it wasn't just me. How is it that slow on a gigabit connection?

In fact, I really like their catalog and match algorithm, but I always thought Apple Music outside its own fence feels gross.


The Android app is good, though I can't compare it to the iOS version. On Windows, iTunes works okay-ish, albeit slow. On Linux, the web app is the only option and it sucks way bad.


> I certainly wouldn't want to carry a mirrorless camera everywhere, nor deal with Bandcamp.

Bandcamp recommendations and artist/album similarity ratings are top notch, and there are many clients for the platform.

I personally can't use Spotify or Apple Music after taking advantage of Bandcamp.

Bandcamp also only takes a 10% - 15% fee for digital purchases, and 10% of purchases for physical items, which is an incredible deal for artists compared to what they make from streaming services like Apple Music or Spotify.


For Bandcamp maybe regarding playing and library, but not regarding discovery.


Depends on the mirrorless camera, doesn't it? The smaller ones are hardly bigger than a large smartphone, and they do offer way better picture quality.


They're still much bigger than an iPhone. Sure, if you walk around with some 13" tablet that doubles as a phone, the point may be moot, but a "regular" iPhone is much smaller than even compact cameras.

I can stick my iPhone 7 in my jeans pocket. My Olympus Pen-F (one of the "smaller" mirrorless cameras) with a small prime lens needs a pretty big pocket only some of my larger coats have. Plus it's heavy enough to pull on said coat and make it uncomfortable.

Yeah, image quality is pretty bad on the iPhone compared to the Olympus. But when I go out and about and don't want a bulky thing hanging around my neck / forearm or an extra bag, the Olympus stays home, so its image quality is exactly zero. The iPhone beats that hands down, even at night.

Now don't get me wrong, I love my Pen-F, and it's an incredible improvement over the DSLR I used to haul around before. But iPhones are getting pretty good for my needs now.


The best camera is the one you have with you.


Not really. I have so many shitty Razr and smartphone videos that I wish I had just enjoyed the moment instead of wasting time capturing something that has the quality of a gameboy camera.

Now I am thankful my dad endured the pain and captured a lot of stuff on the JVC shoulder camera. At least the quality is pretty good even for 80's/90's.

We don't capture events like we used to anyway. No one holds their camera up for more than a minute, so we're left with all these short videos. I like watching the really long stuff my dad shot.


Couldn't agree more!


Not really (on size): what mirrorless can you fit into your pocket? The smallest ones are still 3-4" thick because of the lens.



While not opting out of most of the convenience the Apple ecosystem gives me, I did buy a GRIII and carry it along everywhere I go - together with an otherwise capable iPhone. The difference in optics and sensor really does show in print or when cropping.


I have one of these and yes, I can cram it into my pocket with some effort, but it isn't very testicle friendly.


Your trousers? No. Your coat? Definitely. Unless you go for full-frame ones with bigger lenses, in which case your priority most likely isn't carrying your camera gear in your pocket.


I’m not wearing a coat all year long, and almost never on vacation. The latter situation is the most important one where I need a less bulky solution.


Fuji X-E4 + 27mm f/2.8 at 2" thick is quite pocketable.


Sincere question: Can mirrorless cameras match or do better than the result of the multiple exposure stuff that phones can do? (E.g. Apple’s “Deep Fusion” feature)


Yes, easily. Modern DSLR sensors have a higher native dynamic range, so you can get more out of a photo without using multiple exposures in the first place.

Also, most DSLRs these days have some sort of multi-exposure functionality built in. Sometimes it's as simple as 'bracketing' (so you get multiple photos you have to merge together in post), but some fancier ones offer HDR functionality in-camera, so nothing to do in post.


I have to travel a ways to specialty shops to buy reasonably good headphones with wires. It makes me mad every time I need to buy some new headphones for my computer.


I carried a google pixel 3 which took excellent pictures.

That said, I now carry a mirrorless camera everywhere. The pictures are WAY higher quality and I have more control over the experience and the final output. Don't get a mirrorless if that's not important to you.

When I want to just take a quick selfie, I still have my phone. The iPhone is great at taking pictures, but it's never going to match a mirrorless, which has an enormous sensor compared to anything you'll get on a phone.


"Listening to Music takes 3 clicks and just a few seconds; Wired headphones never run out of battery and have superior audio quality ..."

This item is the reason I am leaving the iphone and trying an unlocked/stock android device.

My music collection is a directory tree that I have curated and organized since 1996.

The correct way to deal with this is to move this directory tree onto my phone (either via network transfer or attaching a USB filesystem) and then browse those files with a music player app.

Anyone familiar with iDevices knows that every piece of the simple, standard workflow I just described is totally impossible.

Instead, you have to manually build playlists inside of itunes while "importing" your music (and storing two copies of it) and then transfer those playlists (one by one) to the idevice and ... it's just insane.

It is a workflow built for people that impulse buy a track here and there ...


> The correct way to deal with this is to move this directory tree onto my phone (either via network transfer or attaching a USB filesystem) and then browse those files with a music player app. Anyone familiar with iDevices knows that every piece of the simple, standard workflow I just described is totally impossible.

On the contrary, any number of apps support precisely this.

- https://readdle.com/documents (not just music)

- https://www.everappz.com/evermusic

- https://brushedtype.co/doppler/

- https://apps.apple.com/us/app/sony-music-center/id724406878 (if you use Sony headphones)

- http://www.videolan.org/vlc/download-ios.html

- https://apps.apple.com/us/app/mrmc-touch/id1062986407

There's no point in listing all of them.

Having ripped some 20,000 CD tracks a couple decades ago, I use several such apps.

As a user of such apps, I'd quibble with "correct", though, given the Apple Match with iCloud One combo and last year's update supporting high resolution / lossless.

Over time, I have come to use those apps less than Apple Match, which mirrors my rips using tracks from Apple's library where they have them, or uploads mine where they don't, giving me more seamless access across all devices, spoken access from Siri on HomePods, etc. Match was a debacle at launch, but is now almost never wrong on even the most obscure tracks.


I've had good luck with flacbox https://www.everappz.com/flacbox

It supports FLAC, obviously, but also Opus. For my phone, I've transcoded my FLAC files to Opus and can carry my entire music collection wherever I go. I used their import tool to build the same folder structure that I have on my NAS. I have iTunes installed on my Windows machine (no Macs at home), but I try to avoid it if at all possible.


This app uploads your identifiers and activity to the developer without consent.

There's a growing trend these days of making every single player app into spyware, and it's sad. I won't use reader apps or player apps that read or play local files that are going to transmit my activity off device for no reason that benefits me.


From this Linux (formerly Windows from a looong time ago) guy this looks dreadfully silly and annoying. Files and folders, y'all.


All those players work from files and folders. That was the point.


Sure. I'd say I was also commenting on the idea that "files and folders" being anything other than an extremely obvious default (to the point of perhaps requiring help on a forum) is, again, terrible and silly.


> Instead, you have to manually build playlists inside of itunes while "importing" your music (and storing two copies of it) and then transfer those playlists (one by one) to the idevice and ... it's just insane.

This is not completely true. You can store your iTunes collection wherever you like, organized however you like. You don't have to duplicate anything, although it's true that the default is for it to "import" it.

You can also create an "all my music" playlist which you can sync with the iDevice.

I used to have this setup with my music collection on Google Drive (because it didn't fit on my MBP's internal drive) and synced some of it to my iPhone. It worked well enough. The issue was more that all the music couldn't fit on the phone, so I had to pick and choose anyway.

The real gotcha is that iTunes doesn't support FLAC, so I had to convert everything to M4A.


"You can also create an "all my music" playlist which you can sync with the iDevice."

Yes, but then how do you deal with that enormous "all my music" playlist once it is in the iDevice ?

You can't browse by directory. You can't organize or display based on filename. So I guess I could parse all of the collection and transpose the artist/title/album out of the filename into mp3 metadata and then I would have a ... 30,000 track playlist ?

Again, all of this makes perfect sense if you're impulse buying a track here and a track there and if there is some way to move that "collection" to a new device every 2-3 years.

It's just not for me.


Well, I don't know how you organize your music "by directory", so maybe you can't reproduce what you do.

In my case, I organize it by artist / album / track number - track title; or by compilations. I then search for the album or the artist. I never have just random single tracks, so a directory is an album, which I have in Apple Music.

But I guess that you can't have any kind of organization you want, which is something that folders could give you.

> So I guess I could parse all of the collection and transpose the artist/title/album out of the filename into mp3 metadata and then I would have a ... 30,000 track playlist ?

Well, in the case of a meticulously managed collection, I'd expect the files to have correct metadata. Again, if this isn't the case, and you rely on file name / location, yeah, you're gonna have a bad time.

Just for the record, I've never bought any track off iTunes. All my music is ripped from CDs.


"Well, in the case of a meticulously managed collection, I'd expect the files to have correct metadata."

WAV files don't have metadata like mp3 files (typically) do.

I'm not saying I copy the uncompressed wav collection to my phone (~700 GB) but I am saying that my original metadata schema has all of the metadata in the filename:

Last, First - AlbumName - 01 - SongName - 3m25s.mp3

... and yes, I could parse and re-encode all of these, populating their MP3 tags with those fields, but man... what a load of work, just because iTunes can sort by 50 different attributes, just not filename:

https://www.tech-recipes.com/wp-content/uploads/2012/11/itun...

... just look at all of those fine grained, fancy ways to sort by ... but the most basic attribute of all the filename is missing.
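For what it's worth, the parse-and-retag step is mostly mechanical. Here's a sketch of the filename-parsing half; the naming scheme is the one described above, and the field order is an assumption (actually writing the tags would need an ID3 library, which is not shown):

```python
import re
from pathlib import Path

# Hypothetical parser for the "Last, First - Album - NN - Title - MmSSs.mp3"
# scheme described in the comment above.
PATTERN = re.compile(
    r"^(?P<last>[^,]+), (?P<first>[^-]+?) - "
    r"(?P<album>.+?) - "
    r"(?P<track>\d+) - "
    r"(?P<title>.+?) - "
    r"(?P<min>\d+)m(?P<sec>\d+)s$"
)

def parse_track(filename: str) -> dict:
    """Split a filename into tag fields ready for an ID3 writer."""
    stem = Path(filename).stem  # drop the .mp3 extension
    m = PATTERN.match(stem)
    if m is None:
        raise ValueError(f"unrecognized name: {filename}")
    return {
        "artist": f"{m['first'].strip()} {m['last'].strip()}",
        "album": m["album"],
        "tracknumber": int(m["track"]),
        "title": m["title"],
        "length_s": int(m["min"]) * 60 + int(m["sec"]),
    }
```

Run over the whole tree, that turns a 30,000-file rename problem into one script plus a tagging pass.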


> WAV files don't have metadata like mp3 files (typically) do.

I feel it's my civic duty here to repeat: they do; in fact, they even support ID3 tags, but it's up to the software developer whether to support them or not.

WAV tagging support has grown in the past few years, but yeah, iTunes certainly doesn't support it.


> ... just look at all of those fine grained, fancy ways to sort by ... but the most basic attribute of all the filename is missing.

That’s a screenshot from a decade-old iTunes. And it includes the option to sort by name.

I’ve just checked with the most recent version of Apple Music, and the wav filename appears in the title field and you can sort by it.


Did you try any third party music apps?


Browse by artist, album, etc?


> Anyone familiar with iDevices knows that every piece of the simple, standard workflow I just described is totally impossible.

No, it is not impossible. It is very easy, at least as a Linux user.

There is a project for that called ifuse: https://github.com/libimobiledevice/ifuse

It basically lets you mount the Media filesystem and the app-specific sandboxed filesystems without a jailbreak.
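In practice that workflow looks something like this (the VLC bundle id is just an example of an app whose Documents directory you might mount):

```shell
# Mount the device's media filesystem (music, photos, ...)
mkdir -p ~/iphone
ifuse ~/iphone

# Or mount a single app's sandboxed Documents directory instead
ifuse --documents org.videolan.vlc-ios ~/iphone

# Copy a curated directory tree over, then unmount
cp -r ~/Music/collection ~/iphone/
fusermount -u ~/iphone
```

From there, any file-browsing player app on the device can see the tree you copied in.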


The old iTunes Media Library .xml format isn't too complicated - I wrote a few small tools to generate m3u8 playlists, and another to convert .m3u8 playlists into itunes library xml. Then I just import that straight into the music app.

Shouldn't take more than an afternoon of a reasonably proficient developer's time. I agree that it's not great that you should need to do this kind of thing, but it's less effort than switching to android ;)
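The library-to-m3u8 direction really is small. A sketch using only the standard library (`Tracks`, `Total Time`, and `Location` are keys in the real iTunes Library XML plist; playlist selection is omitted for brevity):

```python
import plistlib
from urllib.parse import unquote, urlparse

def library_to_m3u8(xml_path: str) -> list[str]:
    """Return extended-m3u8 lines for every track in an iTunes Library XML."""
    with open(xml_path, "rb") as f:
        lib = plistlib.load(f)  # the library XML is an Apple plist
    lines = ["#EXTM3U"]
    for track in lib.get("Tracks", {}).values():
        secs = int(track.get("Total Time", 0)) // 1000  # stored in ms
        artist = track.get("Artist", "Unknown")
        name = track.get("Name", "Unknown")
        lines.append(f"#EXTINF:{secs},{artist} - {name}")
        # Location is a file:// URL with percent-escapes
        lines.append(unquote(urlparse(track["Location"]).path))
    return lines
```

The reverse direction (m3u8 back into library XML) is the same plist written with `plistlib.dump`, which is roughly what an afternoon of work buys you.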


> Anyone familiar with iDevices knows that every piece of the simple, standard workflow I just described is totally impossible.

I believe VOX Music Player allows you to upload your music without using iTunes - although it is using their cloud sync instead. Flacbox also seems to let you download and play local files from Dropbox, OneDrive, Box, SMB servers, DLNA servers, ...

It doesn't seem "totally impossible"?


My music collection is a giant pile of files (~19k tracks, according to iTunes) that I have been curating for a similar length of time, but I let iTunes do all the grind of organizing them on disc for me. My purchasing and listening is pretty much entirely on an album basis.

iTunes uploads it to Apple's cloud, since I'm paying for iTunes Match; my iPhone and iPads pull it down. It is very easy to search for a single album on the iOS players; they also make it very easy to find your most recent acquisitions, with an automatically-populated "Recently Added" playlist.

~/Music/iTunes/iTunes Media/Music looks almost exactly like the directory structure you're probably meticulously maintaining by hand (lots of Artist Name/Album Name/01 Track Name.m4a), but adding new stuff to it is a simple matter of dropping a directory full of properly-tagged files onto iTunes. Then I delete the original files after iTunes has copied it into its directory.

It breaks down if 90% of your collection is a bunch of badly-tagged files you downloaded off of Kazaa, but if your collection is a mix of stuff you ripped from CDs back in the nineties and made sure was tagged properly, and stuff you've bought that the musician/store tagged properly, it works pretty much seamlessly. Except for this one weird glitch where sometimes iTunes on my Mac decides that I have both the copy of a track I bought off of the Apple store and one in the cloud, and thus plays every song off an album twice. I've kinda quit buying stuff from Apple because of this; I'll go to Bandcamp first.

What is the attraction of meticulously maintaining a directory structure that a computer can very, very easily maintain based on the metadata stored in the files? Why are you so married to explicitly browsing a duplicate of this filesystem on other player devices?


When you drop all of your stuff into iTunes, won't it automatically re-encode everything to either AAC, possibly leading to perceptible audio errors, or to ALAC, bloating the file sizes to a ridiculous degree?


Nope. I've got WAVs and MP3s sitting around my library that I know I have dropped in there as recently as a few months ago.


Most interesting; I should take note of that. Thank you.


> Anyone familiar with iDevices knows that every piece of the simple, standard workflow I just described is totally impossible.

It is so bad that it makes me wonder who at Apple thought any of it was a good idea. It's not even 'bad from a certain perspective' bad, but totally FUBAR.

I simply stopped trying to use my phone for playing music, and it is the #1 reason why my partner wants to move away from iOS.


Uh, did something change? As recently as a year ago I just dragged files into iTunes (or Music or whatever they call it now, but this was definitely after the split) then selected which artists I wanted to sync, and I only had to do that step because I didn't want them all on there. I didn't have to do anything with playlists.

But then, my metadata's in pretty good shape, so it barely even matters how my music files are stored. One big flat directory, carefully named folders, not that important.

To be fair, I guess I did have to convert the FLAC to M4A. Drag to a converter program, convert a ton in one go, drag those into iTunes. So that's one more step.


I think that manually organizing your own files is not really compatible with a large music library (I have around 2 TB of music) and, from my experience, was too much of a barrier for most people anyway. I currently use Swinsian to organize files (and various metadata-fixing tools) and Plex + Plexamp to listen to them. Mac mini server, iPhone. Works great. This is just to say that one person's "correct" is another person's cumbersome and unintuitive.


Using a file system as a metadata storage system is pretty dumb - especially when music files have ID3 tags built into their formats. There are tools that will help you fill out the metadata in the tags based on your file system layout. Once you do that you can use the tags to slice/dice. Smart Playlists are very powerful.

Or you can use one of the many other apps others linked to if you really want to stick to the whole filesystem thing.

I quickly got away from trying to organize stuff in the file system when I got my Personal Jukebox 100 (PJB100) - literally the first hard-drive-based MP3 player out there, back in the late 90s. It heavily relied on MP3 tags, so I developed tag discipline early on - and never looked back. Tags are WAY more flexible than folder structures. I couldn't care less how files are stored in the file system.

This conversation always amuses me - we don't complain that the computer tracks all the parts of our files in a directory while we have no control over the layout of the files on disk, mainly because that's a level of minutiae better left to automation. For me it's a similar thing with my music files. As long as my tag information is accurate (and since it's the first thing I do when I add something to my collection, it is), I can manage my music collection however I want, irrespective of where the file is.


Boot Camp on a Mac is real hit or miss when it comes to Linux. Honestly, I've never seen a modern MacBook Pro run Linux natively without major compromise. Hell, the Windows drivers are bad enough; Apple doesn't really want users doing anything other than OSX.

I will say they have no problem booting up a VM, and everything he is doing can be done in a VM with hardware-accelerated graphics. Full-screened, it feels just like the real thing, as long as you are not playing modern games. Boot into Windows for that.

That being said, there are Windows laptops that will give a MacBook Pro a run for the money, but at the $1k price range the MacBook Air is hard to beat. I like the ARM chips for basic users, but for a power user the ARM chips are a step backwards, unless you want to run ARM-based Linux. With my many years of Raspberry Pi use, ARM has come a long way on the Linux front. But I'm not sure I'd want to rock that as my main.

At the same time you can do all these things on Windows as well... really more of a personal preference. I have limited experience with Windows Subsystem for Linux 2, but it worked damn well when I used it.


I hope things have gotten better with Linux and drivers. I'll admit I've been out of the game for about 5 years now.

> Honestly never seen a modern Macbook Pro run linux natively without major compromise.

When I was using Linux as my main OS, I was having this problem with all laptops. Desktops usually had far fewer issues, but were occasionally less than perfect.

For example, Lenovo, at the time, was hailed as great for Linux compatibility. Well, for some reason, regardless of the distro I was using, I was more or less given an ultimatum -- I could have a screen with adjustable brightness or a consistent network card driver, but not both. Editing whatever file I found on various forums to "fix" the backlight issue (the brightness was permanently stuck at 100%) would inevitably cause some sort of issue where my internet connection speed would drop from around 1 Gbps to Mbps, and eventually to Kbps, the longer the laptop was awake.

It was an issue I never found a solution to, and the oddest part was that it never occurred if I booted the OS off a live USB. Only when the OS was installed would the issue arise.

Various laptops I owned / used for work had their own issues with Linux too. I eventually just settled on using Linux in virtual machines / servers and never looked back. So, I definitely agree with your point about VMs.

I'd consider going back, but I really cannot / do not want to sacrifice too much time making the OS bend to my will when I could use that time to actually get things done.


> Bootcamp on a MAC is real hit or miss when it comes to Linux.

Dual booting on any hardware is nuts if you ask me; pick an OS and use VMs. If you really have to have a native OS but hate it so much you won't use it the rest of the time, stick it on a different machine.


I think it's a "philosophical standpoint"

>I realised that my life while using Apple products is controlled by Product Managers/Owners who want to get a raise, rather than by technology people who share the same passion as me. And I wanted to change that.


Shout out to the unambitious product manager / owner who doesn't want a raise at Fujifilm for making such nice cameras. There is no chance that they tweaked the X-T2 or X-T3 to compete with the also excellent Sony products in order to impress the guys back at HQ.

Obviously they are "technology people who share the same passion as me" and are browsing through HN.


>Obviously they are "technology people who share the same passion as me" and are browsing through HN.

Except they actually are? If you read their dev blogs, it's insane how much love goes into the colour magic built into those cameras. Truly decades upon decades of labour of love.

At Apple, by contrast, they deliberately hinder UX for end users if it means Apple gets to make a bit more profit.


I'm struggling to see your point. Fujifilm does not have the same incentives as Apple.


The author agrees with you that they do not have the same incentives: their products aren't directed by Product Managers/Owners who want a raise, and they are apparently technology people who share the same passion as them.

I just see two companies, that want to sell stuff for more than it costs them to produce, but what would I know.


Many of us probably know senior folks at Apple.

While this is purely anecdotal, I've observed a higher proportion of "mission oriented" product and even I.T. leads at Apple than at the other brands he adopted in this divorce. (He's not using a Framework laptop yet.)


Which of these can I not do on a Mac?

You can do all of them when you use a Mac, but Apple will do a lot to try to persuade you not to. That means you'll spend a lot of mental energy fighting with Apple's very effective p̶s̶y̶c̶h̶o̶l̶o̶g̶y̶ marketing department, and often you won't win. That's fine. You don't really lose anything, but every so often you'll think to yourself "I listen to the same bands all the time" or "I wish this phone had a bigger camera sensor." and you'll regret putting so much of your life, and money, in Apple's pockets.


Honestly, this is by far the biggest problem I currently have with newer versions of macOS.

I would have hoped by now that integration with Dropbox, SharePoint/OneDrive, and Google Drive would have gotten better. But they are mostly barebones.

Granted Google Drive is barebones on nearly every platform.

I'm a big 365 user, and that is by far the best experience so far in terms of fluid use between devices and apps, especially on Windows, where even sharing a file doesn't require a web browser popup. But oh man, the admin side of 365 is overly complex, and it's easy to see why people use G Suite.

iCloud sucks and has always sucked, unless, that is, you are 100% in the Apple ecosystem. Integration with my on-site server is easy with 365, Google, and Dropbox. iCloud is super clunky, and I'm really thinking about moving the wife to something other than iCloud.


What sorts of integration are Dropbox, Onedrive, etc unable to do?


The third dot point here makes absolutely no sense to me.

Apple Music and Bandcamp are essentially the same thing. I personally use Spotify, and I don't have to listen to "mind numbing" music. You can choose to let the application play music for you, or you can choose your own music.

Spotify 10 years ago was exactly how I discovered old/new bands and artists that were not mainstream.

I don't see this persons logic in that regard at all.


The author is stating that not all music sees distribution on Spotify et al.; some of it is instead distributed via Bandcamp or other sources. A lot of older material isn't available through Spotify et al. due to licensing issues, or labels that have gone bust.


It's their tone; there is a difference between "a lot of older material isn't available on Spotify" and saying that Spotify has mind-numbing music.

Spotify to my knowledge and experience has an enormous amount of older music. Enough to make it the main source of music for my parents. I think this guy needs to realise that they don't need to listen to the curated playlists...


> Listening to Music takes 3 clicks and just a few seconds; Wired headphones never run out of battery and have superior audio quality

"Hey Siri, shuffle <playlist name>" - no clicks involved.

As for wired/wireless headphones: I wasn't aware that running Arch was a requirement for using wired headphones, but then again, I moved to wireless headphones a decade or so ago, have changed my habits to let me charge them while I sleep, and check the product specifications of any headphones I buy to see if they actually match my expected usage pattern. But then again, I'm weird.


Sure, you could get a Mac and then opt out of/remove all things Apple, but then why get a Mac in the first place?

I don't think that is confusing at all; it's just a matter of degree between leaving all of Apple's services and additionally leaving their OS as well.


The post can be distilled down to "I had a Mac and thought it made me special, but now I don't feel special so I use something else and think that makes me special".

Which...I guess? Millions (billions) of people use all sorts of stuff. Good for them.

The "I'm leaving [some platform]" posts are always extremely low value pablum for a subset.


As someone who did the same thing recently, just with MS and Google (I still have a hard time abandoning YouTube and YouTube Music, along with the playlists and preferences I've built up over years), I can relate. Because what the OP described is basically the recognition that

- dedicated hardware does a better job than general purpose one (cameras vs. smartphones)

- the smartphone ecosystem (regardless of brand) is becoming more closed, and monitored, every year

- there are enough alternatives out there, even if they mean giving up some "conveniences" we thought we had won over the last few years (IMHO, most of the perceived "inconveniences" of these alternatives are due more to lock-in and dark patterns from FANG/MS and phone OEMs)


There are many reasons why someone might choose some hardware or platforms over others. The paradox, though, is that if someone believes it's all or nothing -- that they have to tie their ego to a platform -- their judgment is likely suspect to begin with. When people make the "I'm leaving 𝑥" type declarations, it almost always comes from an unhealthy place.

Somehow I've managed to own an SLR alongside every system I've used for the past decade+ (though it sits unused to a much greater degree given how vastly improved modern smartphones are...). I have a Lenovo Windows laptop, a Windows 10 gaming PC, an Intel MBP, an M1 Mac Mini, an iPhone, an iPad, though I've owned a number of Android tablets and smartphones (including every Nexus device) before. Every server system I've deployed in the past decade has been to Linux.

No Apple stormtroopers ever busted down my doors and demanded compliance. Never did I feel the need to wave a flag or commit to a tribe, because why would I? The notion is self-sabotaging.


> why would I?

In a word "integration".

Can you use an iPad as a second screen/Wacom Cintiq? On Mac it is built in. On Windows you need extra software, and then it is still duct-taped on.

Linux desktop and Android phone? Then you can reply to SMS, share the clipboard, auto-mute music when receiving a call, ... (KDE Connect; MS and Apple have some decent copies). What if you have an iPhone instead? Well, bad luck.

All these things might be "minor" or "only by default", but they matter very much to many people.


> dedicated hardware does a better job than general purpose one (cameras vs. smartphones)

Hardly headline news - a $1,700 camera with >$1,000 lenses is better than a $1,000 phone with a camera attached. If anything the gap is closer than it has ever been. Regardless, there is literally nothing stopping someone from having an iPhone and a dedicated camera - or if there is, then I (and practically everyone I know) am breaking some sort of rule dictated by the overlords at Apple HQ that thou shalt not use a camera.


Also, as any photographer knows, the best camera is the one you have on you.


>a $1,700 camera with >$1,000 lenses is better than a $1,000 phone with a camera attached.

The original assertion was same price: that a $1,000 kit body + lens is better than a $1,000 phone with its camera.

Which doesn't change anything about how inane a point it is.


> "dedicated hardware does a better job than general purpose one (cameras vs. smartphones)"

I think that depends on what you mean by "better job." For example, I have both a Nikon D-800 and an iPhone. The Nikon takes amazing pictures and the iPhone fits in my pocket. Most of the time, the "better job" I want is a camera that fits in my pocket. On occasion, the "better job" I want is a camera that takes amazing pictures.

I would never abandon either my camera or my phone because my "better job" frequently changes.


Yeah, this is just the nerd equivalent of being a hipster. It's not even as though any of the changes are any better, they're just different. Which of course means you can then write a blog post in a cool monospace font with your cool domain name proclaiming your nerd coolness to the masses.


Not only that, but then you'd also write it in a way that low-key suggests other people probably also want to do the same thing to be just as cool.

Problem is that there are so many different people, and so many of them don't even know how to operate a device that isn't completely pre-prepped for the lowest bar to entry that they can never be as cool. /s

I guess this is the kind of data point that people might use in reinforcing their bias for a choice they are about to make. If you take two brands, platforms, systems etc. and just search for switchers between those you'll find the exact result you'll like.

Going from Debian to Arch? Amazing! Debian sucks for reasons X and Y and Arch is much cooler. But switching from Arch to Debian? Hah, those Arch losers are missing out on A, B and C, so they are so uncool! Heck, you can make this even smaller. Using Open Sans as your font? Boo! Use Fira Code - that makes your code so much better! Pick any microcosm and you'll find migrations in all directions.


This. This 100%. I don’t think my nerd cred is in any danger. I regularly write kernel code, bare metal code, assembly, etc. I know how to computer.

But the older I get, the more I find I’m just happier in the Apple ecosystem. I don’t want to fiddle with X11 settings or tons of dot files. I just want to open my computer and hack on the shit that interests me. There’s things about Apple that annoy me, but the same can be said for Android, Windows, any of the free *nixes, etc.

Every platform and ecosystem has trade offs and it’s fine to just use what you want to use. Personally I’ve got Apple laptop, phone and tablet, a Windows desktop for games, several RPis running various Linux distros, and a Proxmox server running VMs for infrastructure and tinkering. None of that makes my farts smell better than anyone else.

… except for those nasty Emacs users of course :P


I read this article and was immediately reminded of “that kid” in school who was a super fan of some obscure punk band, until some other kid at school claimed they liked them too. Then “that kid” would tell everybody how the now not obscure enough band “sucked”.


It reminds me of the early MUDs and some MMO guild bulletin boards where you'd periodically get these super melodramatic "I quit and this is why" posts, which would more often than not be followed by the person returning to the game not long after.


The longer and more impassioned the goodbye-forever post, the sooner you know the author will return.


If it's really long, it means they love the game but burned themselves out and don't want to admit it.


A long enough post and they'll be back before they get to the end of their goodbye.


This still happens frequently. Even before the lawsuit against Blizzard, their WoW forums would basically have daily posts from people who felt the need to cry about things they didn't like about the game and tell everyone they're quitting.


Some people don't care about their computers making them feel special; they just have a set of goals and choose the computer hardware and software that best accomplishes those goals.


But those people I'd say wouldn't then go and write an article about it.


Can you imagine how many very boring articles there would be if everyone did write that article.

'I needed Windows for some work stuff, so I went on Dell's site and bought the thing in the middle of the price range.'


On a Mac with, say, 4 workspaces, what key combo can I hit to go to workspace 2?

Yes, you can hit Ctrl + left or right arrow multiple times, but... once one gets into a workflow that allows hitting Super+2, it actually feels like a pain to do anything else.


Control-2 if you enable it. Or 1, 3, or 4.


> I'm confused ... Which of these can I not do on a Mac?

You apparently didn't read the article properly. The author's major grouse is how he slowly recognized that Apple is very controlling about the user experience on its devices, and how this is a huge limitation on doing anything "outside" Apple's "thinking" about how software or hardware should be used. And how Apple's product managers have lost sight of what really adds value to the user experience, with the software and hardware choices in new Apple devices now dictated more by their own greed/ambitions.

From the very examples you cited, the author's emphasis was thus on:

> ... but I am back in control. I am more productive.

> Wired headphones never run out of battery and have superior audio quality ...

> I can take real photos with high quality ...

> I have fun again discovering bands and artists on Bandcamp instead of mind numbing listening to Apple Music playlists

> ... but also my OS is way more productive and faster

So it's not just about what you can or cannot do on a Mac, but about how the author has found a better way to do all this outside of Apple's limiting ecosystem, in keeping with his new belief that Apple no longer cares about users like him. And I fully agree with him and share the exact same feeling (I feel Apple says a "F*k you" to me every time I want to maximise a Finder or Safari window, because the "Apple way" is that you are only supposed to make them fullscreen or vertically maximise ...)


I’m actually joining the Apple ecosystem from Android/Windows for similar reasons. In particular, I wanted to use Logic instead of paying huge sums for new versions of Ableton; I wanted to use Pixelmator Pro instead of paying for Adobe subscription (although Affinity Photo is also available on windows); I wanted to use Time Machine instead of Google Drive; I especially wanted my passwords stored on my computer instead of locked behind my Google account at passwords.google.com (I was locked out of my Google account for a week once and it was temporarily life-ruining); and finally, I was so tired of trying to keep up with endless UI changes in Windows and Android. I’m not saying these were good reasons, but they were my reasons. One person’s trash is another person’s treasure I guess.


Google's ecosystem != Windows ecosystem.

There's FLStudio (or Bitwig, or Reaper) to match Apple's Logic, 1Password (or Bitlocker from what I hear) to match Google's passwords.google.com, and Backblaze+others to match Time Machine

Sure, 'switching to Apple' is a choice, but only one of many. Personally I am tired of using tech designed for and by managers and SWEs gunning for promotion rather than for personal empowerment or a passion for personal computing.

Besides, everyone needs to ditch Chrome. Firefox could use the love.


> Backblaze+others to match Time Machine

BB complements TM for me; it's super useful to have a local versioned backup that's updated every time I dock my laptop at the desk, invisibly supplemented by a versioned backup of recent changes that the OS maintains in unused space on the laptop's drive. It works even if I'm sitting at a cafe with all the radios turned off to get maximum life out of the battery.

Backblaze? Backblaze is there for in case my house burns down and I leave with nothing but the clothes on my back.


Bitwarden. Open source, independently auditable password manager. For something this important, there should be no other way.


On the other hand, having as many users as possible use any password manager at all is an immense challenge as-is. It really doesn't matter what gets them using one as long as they do. If an OS-native one happens to have the lowest friction, so be it.

For everyone else, sure, there might be 100 people in the world that will actually audit their open source password managers. But that isn't exactly moving the rest of the industry forward (be it from the engineering perspective or the user perspective).

In this case, (almost) perfect is the enemy of good.


Ah, shoot. It's too late for me to edit my original post but this is what I meant to mention. Bitlocker is Windows' FDE solution


> "Personally I am tired of using tech designed for and by managers and SWE's gunning for promotion and not personal empowerment or passion for personal computing"

Unless you have spoken to all managers and SWEs I doubt you can make such a blanket statement about the immense workforces at Microsoft, Google and Apple.

Sure, one could argue that Microsoft is profit-driven, Google might be marketshare-driven and Apple might be UX-driven, and all of those could be true (or at least true for a major subset of both the engineers and the targeted user group). But it would be a guess at best and not helpful to your personal situation or anyone else's.


How could you possibly interpret this new world of digital sharecropping ("cloud computing") as empowering? Where is the passion for 'personal computing' when nearly everything personal about it is delegated to the cloud?

Personal computing is a box under your desk or in your pocket beefy and smart enough to solve your problems without checking in with daddy warbucks and the mothership.

Nearly everything about modern cloud computing is the opposite... it disempowers individuals and creates an unhealthy relationship with things that are out of the user's control. How many people's livelihoods have been messed up because FANG decided to lock them out for some reason or another? How many articles have been posted to HN and elsewhere begging for a FANG employee to come along and fix a problem because there is no other option?

The server in my closet or at my datacenter isn't going to lock me out because I pissed off Google.


There's literally nothing stopping you from working exactly the same way as before the "cloud" was a thing. You just ignore the cloud features.


In theory, yes, but in practice Microsoft Office favors its cloud service when you click "Open" or "Save". Didn't they break AutoSave so it only works when a document is backed by the cloud?

In practice, I am continually having to disable cloud-related features (worded in the most patronizing way possible, as if I need them) every time Windows updates.

In practice, common knowledge about desktop computing is atrophying because of an increasingly acute lack of knowledgeable users on the internet. As more and more of the young users who grew up in this gilded cage reach a level of expertise, their advice pollutes forums with cloud-backed solutions, and knowledge of 'the old ways' becomes harder and harder to find.


There are dark UX patterns pushing users towards the cloud services and/or making the old way harder to use.


Why do I need two clicks on Windows to open the file save dialog? If the program is behaving "like before", why do they change it every couple of months?


>> How could you possibly interpret this new world of digital sharecropping ("cloud computing") as empowering?

Because it brings computing to people - such as my older relatives - who wouldn't use a desktop and would have problems setting up and configuring one, especially for an "advanced" use case such as running your own private cloud.

As a computer instructor back home in West Africa, before smartphones I had students (especially older ones) who struggled even with using a mouse or keyboard. I've never had to teach a single one of them how to use their smartphone (at least for things like browsing the internet, taking and editing photos, and making notes). Asking them to download and figure out a program like Photoshop (or ImageMagick, for example) is a non-starter; with photo editing on-device, they can edit, filter, share, and make home movies - all without needing any help.

>> Where is the passion for 'personal computing' when nearly everything personal about it is delegated to the cloud?

I don't know what you mean by passion, but these people love their phones and the things they let them do, with an intuitive UI (and a transferable one between Android and iOS, since the two are about the same feature-wise at the moment).

>> Personal computing is a box under your desk or in your pocket beefy and smart enough to solve your problems

This is ONE version of personal computing, and tbh an option only available to us "hackers".

>> without checking in with daddy warbucks and the mothership.

You don't need to check in. My country has crappy bandwidth and of course it's impossible to do things like pay for iCloud or gdrive without credit cards, so most use the phone without backups, and they're fine.

>> Nearly everything about modern cloud computing is the opposite... it disempowers individuals and creates an unhealthy relationship with things that are out of the users control.

How is allowing my students, older relatives, etc - who wouldn't be able to use a laptop or desktop machine - to go online and talk to their children abroad using apps like whatsapp - disempowering? I understand your point - I miss the days when we had computers to mess around with and learn Linux and coding on. Most of us on here came up that way, compared to kids now who don't have to see the underlying OS or tinker with it - they can just play Fortnite and watch YouTube.

But most people aren't interested in these things, or even care about machine models, OS versions etc. They buy phones because they want to communicate online and talk to their friends and family. That's it.

Before whatsapp, the only way I could talk to my family was through international calls, which are prohibitively expensive, so they happened about once a month. Once they got on whatsapp, though, we could talk and share videos and pics of things happening at home every single day. In addition, a large part of getting rid of our dictator was people using apps like whatsapp to share the latest information, even as he cracked down and allowed only propaganda to be published, even in private newspapers.

This is a net positive however you look at it - it has brought far more people (especially in poorer countries) into the digital age. I find that very empowering in the sense of putting tools in the hands of more people, including barely literate ones (my country has something like a 40% illiteracy rate) who use things like whatsapp voice notes to communicate.

>> How many peoples' livelihoods have been messed up because FANG decided to lock them out for some reason or another?

This is true but not really related to your larger point about computing devices and how they are being used now.

>> The server in my closet or at my datacenter isn't going to lock me out because I pissed off Google.

This really doesn't happen that often, considering we're talking about billions of users.

We are on a "hacker" site where most people are skilled "tech people" so of course when things break for them they write blog posts and comments and we get to see and argue over those. Most people using these devices don't know or care about any of this, and have gmail accounts going back a decade that they've used without any issues.

In summary, your version of computing (which is similar to mine) just isn't the universal version. I always find joy whenever I go back home and find taxi drivers using their phones to play music, people using YouTube to check instructional videos and post their own, etc. When my grandmother died, my uncle gathered all the photos, videos, etc. that we have of her as a family on whatsapp, and made a nice video tribute of her life to share with us. Gathering the photos and videos took longer than making the movie itself, which was a few taps, and it was a very powerful moment emotionally, especially for us family members outside the country. We were able to participate in remembering her life together in a way that international phone calls and a complex (to my uncle) photo editing app on a PC under a desk never would have let us.

I find this very empowering, and it makes me happy how far we've come and how many more people we have brought across the digital divide. Let's not be myopic because those of us on here have the technical knowledge to disdain and even dump these platforms.

And keep in mind that American / Western users' use cases are very different from the farmer who can only afford a cheap android phone, being able to come online finally to check on crop prices so they won't get cheated, send pictures of their crops to prospective buyers, and even do things like checking the weather.


> How is allowing my students, older relatives, etc - who wouldn't be able to use a laptop or desktop machine - to go online and talk to their children abroad using apps like whatsapp - disempowering?

It's the tyranny of the "minimum viable user" [1]. By making information systems that are safe enough for your students or their grandparents to use, the companies have to take away the customizability, configurability and assorted "sharp edges" that make those systems useful for more advanced users. I say "have to" deliberately, because the counterargument of, "Why can't they make both," never seems to describe a real world system.

Whether it's Windows, macOS, or heck even GNOME 3, the more an information system attempts to cater to the needs of novice users, the worse it becomes at catering to the needs of advanced users. And the terrifying reality is that inexperienced users outnumber us hackers by two orders of magnitude (or more).

[1]: https://old.reddit.com/r/dredmorbius/comments/69wk8y/the_tyr...


It's pretty obvious that everyone who works at Microsoft (specifically) either lacks the motivation for personal empowerment in computing or lacks the power to enact it. The end result is the same: a tech corp that oppresses the end user wherever feasible.

Some points to back up my contention:

- Microsoft does not allow the Windows serf to uninstall Edge on Windows 10.

- Microsoft is hostile to the idea of allowing a Windows serf to self-sign their own TPM.

- Microsoft does not allow the Windows serf to inspect the code that arbitrates their computing.

- Microsoft does not offer the Windows serf any ability to inspect, reject, or roll-back "updates."

Understanding these harms helps my personal situation by enabling me to make informed decisions about avoiding serfdom to Microsoft and their legion of cog-like engineers, which is definitely to my benefit. Microsoft engineers actualize the harmful policy that "Windows users deserve less"; therefore, the sniveling SWEs at Microsoft deserve less.

Part of that less, which I shall forever withhold, is my money and my endorsement of their basic capacity for ethical computing. Microsoft engineers are simply not respectable with regard to any ideals of user empowerment they may pretend to hold.


Backblaze is great, but it’s not a replacement for time machine (and I use both).

Being able to buy a new Mac and make it into an exact copy of the old Mac by using time machine is amazing - I’ve done it a few times over the years, and it still impresses me every time. Plus I have time machine backups from 2011 which I occasionally browse to see what my kids were up to then.


I found out the hard way that Backblaze, despite their claims of "...automatically back up all your files including documents, photos, music, movies, and more. It’s that easy.", do not back up .dmg files (among others) by default. Which seems to be a huge omission considering the numerous use cases outside of just application installers.


I'm going to go out on a limb and say that for the majority of users, any .dmg they have is a software installer they could easily redownload, so it's a waste to back up.


No need to go out on a limb - .dmg files are usually software installers. But if I'm paying for a backup service that claims to back everything up, I want them to do exactly that, regardless of whether it can be re-downloaded or not (re-downloading takes additional time, not to mention figuring out what was missing from the supposed "everything" backup in the first place).


Go into the settings and remove it from the exclude list then. Storing a billion copies of Spotify.dmg is very wasteful


How about just not claiming to back up everything in the first place if that isn't the default behavior?


Complain to their support rather than complaining on "Hacker" News without even opening the settings panel, come on.

You are a different target market; always check for silly normie settings.


I'm surprised that Backblaze doesn't checksum the files on the user's computer to see if they already exist on its servers, to avoid duplicate uploads.


Dropbox had to quit doing this 10 years ago

https://news.ycombinator.com/item?id=2478567#2478608

Apparently Dropbox notices they already have that file, and instead of you uploading it they just make it appear in your account.


The first time that backfires, though…


They apparently do, but only within a user's computer (or account, I don't remember which). I would be pretty uncomfortable with them comparing my files' checksums against a master list of sorts.


This is exactly what Dropbox does: they hash a file, compare with all known hashes, and deduplicate accordingly. The file is then encrypted with the hash of its contents.
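The hash-then-deduplicate scheme described above can be sketched roughly as follows. This is an illustration only, not Dropbox's or Backblaze's actual code; the class and method names are made up:

```python
import hashlib

class DedupStore:
    """Toy content-addressed store: identical file contents are kept once."""

    def __init__(self):
        self.blobs = {}    # sha256 hex digest -> stored content
        self.uploads = 0   # how many real byte transfers happened

    def upload(self, data: bytes) -> str:
        digest = hashlib.sha256(data).hexdigest()
        if digest not in self.blobs:   # server has never seen this content
            self.blobs[digest] = data  # only then transfer the bytes
            self.uploads += 1
        return digest                  # handle the client keeps for the file

store = DedupStore()
first = store.upload(b"contents of Spotify.dmg")
second = store.upload(b"contents of Spotify.dmg")  # another user, same installer
print(first == second, store.uploads)  # True 1 -> stored once, referenced twice
```

The privacy concern raised upthread follows directly: because the server compares digests across accounts, it necessarily learns which users hold byte-identical files.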


The majority of users have no backups at all. The ones that do either use one of the big three cloud syncing things (Google Drive, iCloud Drive, OneDrive) or use a local backup (TimeMachine). Everything else is just a niche. (unless you want to pull in corporate users)


That is very disappointing. Do they block ISO files too?


I don’t think it’s a matter of blocking, they’re just filtered by default. You can edit the filter list though so those files are backed up, and that’s what I do.


If I had known about their filter list, I would've done exactly that. But their implication that everything was backed up from the beginning by default is what got me.

ISO files are included in that filter list, among other files like VMs and such. Seems like a massive exclusion net to cast when claiming everything is backed up.


Backblaze has a bunch of sheisty rules put in place because they're afraid of supporting data hoarders, and want to minimize the amount of space users take as part of their 'unlimited' plan, so I am not surprised.


Nothing is being blocked. They have a default exclusion list which works very well for the overwhelming majority of users. It prevents people spending forever backing up large downloads that they can just get newer versions of.

Removing these exclusions is very easy and obvious.
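The behavior being debated boils down to a suffix filter applied before upload. As a rough sketch (the extensions listed here are assumptions for illustration, not Backblaze's real exclusion list):

```python
# Toy model of a backup client's default exclusion list.
DEFAULT_EXCLUSIONS = {".dmg", ".iso", ".vmdk"}  # hypothetical defaults

def should_back_up(path, exclusions=DEFAULT_EXCLUSIONS):
    # A file is skipped when its extension is in the active exclusion set.
    return not any(path.lower().endswith(ext) for ext in exclusions)

print(should_back_up("/Users/me/Documents/taxes-2021.pdf"))   # True
print(should_back_up("/Users/me/Downloads/Spotify.dmg"))      # False
# Removing an exclusion re-includes those files in the backup:
print(should_back_up("/Users/me/Downloads/Spotify.dmg",
                     DEFAULT_EXCLUSIONS - {".dmg"}))          # True
```

The whole disagreement in this subthread is about whether sensible defaults like these square with marketing that says "everything" is backed up.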


Time Machine isn't that great as a backup system and has issues. Carbon Copy Cloner is a better tool for creating snapshots to restore from as a local backup system, IMO.


Why is Time Machine not great? "Has issues" can be said about pretty much every piece of software out there ...


+1. File History is the Time Machine equivalent for Windows. It works great.

It should be used in concert with an offsite backup service like Backblaze.


Awesome! When I've moved Windows computers, it has not been easy getting things back the way my old machine was.

Can you link to a step-by-step on this? I think it would help a lot of folks.


Rather than rehash, I'll point you to Scott Hanselman's excellent (and old, but still valid) article on Windows backups.

https://www.hanselman.com/blog/is-your-stuff-backed-up-recov...

Check out his YouTube videos as well, for a more visual walk-through.

https://www.youtube.com/channel/UCL-fHOdarou-CR2XUmK48Og


Does that allow you to set up a new PC as an exact copy of the old one? Because Time Machine does just that, and I’d really like to find the Windows equivalent.


Probably an unpopular opinion, but I actually prefer to start from (nearly) scratch on a new machine, as it gets rid of all the accumulated cruft. I use GitHub + stow for my dotfiles, Firefox Sync for syncing browser history, bookmarks, etc., and rclone + Backblaze for backing up whatever is important enough. Beyond that, I don't care if I lose anything on the old machine.
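For anyone unfamiliar with the stow part of that setup: stow keeps dotfiles in a repo and symlinks them into $HOME. A minimal Python imitation of the idea (throwaway temp paths; real GNU stow additionally merges directories and detects conflicts):

```python
import os
import tempfile

def stow(package_dir, target_dir):
    """Crude imitation of `stow`: symlink each top-level entry of a
    package directory into the target directory (normally $HOME)."""
    for name in os.listdir(package_dir):
        os.symlink(os.path.join(package_dir, name),
                   os.path.join(target_dir, name))

repo = tempfile.mkdtemp()   # stand-in for ~/dotfiles (a git checkout)
home = tempfile.mkdtemp()   # stand-in for ~
pkg = os.path.join(repo, "nvim")
os.makedirs(os.path.join(pkg, ".config"))
with open(os.path.join(pkg, ".config", "init.vim"), "w") as f:
    f.write("set number\n")

stow(pkg, home)
# "home" now sees the config through a symlink; the file itself lives in the repo.
print(os.path.islink(os.path.join(home, ".config")))  # True
```

Because the real files stay in the repo, a new machine only needs a `git clone` plus one stow invocation per package to recover the whole configuration.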


Same. Every time I've tried copying stuff to a new machine via Time Machine, the new machine feels unstable. There's so much stuff in places like ~/Library/Application Support that really shouldn't be copied, so I just start the new machine empty and copy a few folders over (Documents, .ssh)


I don't like it but it does still feel good to start fresh. I do it in linux as well.


Isn't that easily accomplished by storing a disk image (e.g. Clonezilla, SSD migration software, snapshots, ...)? For that, the OS is completely irrelevant.


There are two backup solutions on Windows. 'Windows 7 Backup' (yes, wtf) allows you to make an exact copy and clone it; there are also paid solutions.

File History is for all your files, and it is amazing at it. You can roll back to a particular version from a particular date.


> There's FLStudio (or Bitwig, or Reaper) to match Apple's Logic

Plenty of arguments against Logic, but these three are not in the same league.


For some workflows, Logic simply cannot do what Live or Bitwig or FLStudio do.

For other workflows, Reaper and Logic can do things that are essentially impossible in Live/Bitwig/FL Studio.

There's no best DAW for everyone, only best-DAW-for-the-workflow-you're-using-today. Sometimes that's Logic, sometimes it's not.


Examples of what can't be done in Logic and what can?


Not in Logic: follow actions for clip launching.

Not in Live: Logic's Environment (you could do this in an M4L object, but not in Live itself)


Can't speak for the latter two but FL 20 absolutely steps up to Logic. And they keep adding stuff to it, for free. My license from like 2005 is still valid.


There is no Windows for phones (anymore). If you aren't using an Apple phone, you are pretty much forced to use Google's Android (or one of the various Linux phones with little financial backing).

Windows/Android can be considered a natural pair - Microsoft seems to think so. I personally use the Linux/Android pair, and I've been vaguely considering switching to an iPhone to avoid Google.


Windows/iPhone isn't a bad pair either, provided you're not drinking the cloud-services Kool-Aid.


How do they integrate at all? Windows has the ability to link to an Android phone to view texts, place calls, etc., but no such link exists between iPhones and Windows.


Or if you want to do any form of app development.


If I ever wanted to get into iOS development, I'd probably purchase a cheap MacBook Air for the bare minimum of stuff and use one of the multiplatform SDKs that can produce iOS builds alongside every other platform.


My personal preference for a DAW is Studio One, it’s lovely. I like the efficient UI and it has all the features I ever wanted.

One of the sad ironies of my life is that once I switched to macOS and could’ve avoided all the annoyances and instabilities I’ve had with audio on Windows, I just never felt like making music again. Life is like that sometimes. :)


+1 for Reaper. Fast and fantastic piece of software. I switched from ProTools and have never looked back.


Of course I agree with your point that there are many options when it comes to switching. I tried, but clearly failed, to imply this very point. For me, after consideration of my requirements and tolerances, Apple seemed like the best fit, but I wouldn’t suggest that it has any kind of broader implication for the state of various software ecosystems.

Suffice to say I think all of the software in your list is pretty great and I chose what I chose for reasons that are wholly mine!


I switched to Apple from Android (still 100% Linux on desktop/laptop though) a few years ago, and I've grown to like it more than any Android phone I've owned...But it still sucks.

It feels like everything is overflowing with mediocrity, and there's no hope of it ever changing. I often have an idea for improving something that I see as being extremely shitty and perfect for improving/innovating, but there is no possible way for me to turn it into a successful product/business because the monopolists wouldn't allow it.

For example, why is the app store experience that same boring crap it has always been? Why can't that be fun, with like user profiles, Amazon-style reviews, Steam-style community features, social media-style feeds so you can sub to a developer, TikTok-style feeds of promo videos to discover new apps/games, etc. There's so much to explore there, but it's never going to happen because Apple doesn't allow it, and Apple themselves isn't going to do it because they're a monopoly and they have no incentive to (same goes for Google).

What are the chances that our government will wake up and do something about this limp noodle we call a tech industry? Or that some startup will "disrupt" these monopolists with something better and resist the multi-billion-dollar acquisition offers (backed by anti-competitive threats).

This makes me want to grab my old thinkpad and disappear into the woods. I'll become a hermit, using only my own+libre software and only returning to civilization when I need to replace the battery pack.


> Why can't that be fun, with like user profiles, Amazon-style reviews, Steam-style community features, social media-style feeds so you can sub to a developer, TikTok-style feeds of promo videos to discover new apps/games, etc.

Similar to the other comment, I read that with sheer horror. I specifically don't want to spend a lot of time in any app store. I want to find quality apps I'm looking for, download them and leave.


> It feels like everything is overflowing with mediocrity, and there's no hope of it ever changing.

I’d definitely think that about the Apple Store if they introduced some of the features that you suggested — that’s part of the problem. What one person considers an obvious innovation another person considers yet another newfangled UI design.


Did you just argue that Apple should have more than one store?


No.


What's "fun" about following the developer of a particular piece of business or process support software?

Apple Store already has star ratings, user reviews, and links to publisher web sites. I don't use any of the social/community features of Steam, and I can't stand the amount of review-stacking that happens in Steam and Amazon.

The one area where Steam is useful is its discovery stream, where Steam will show you more games that it thinks you might like - the catch being that if you like one RTS game, Steam will show you lots of RTS games and never show you Mass Effect, Torchlight, KSP, or Dad Dating Sim.

As to why Apple doesn't have a discovery stream, just look at Steam's problematic system and you'll understand: Apple doesn't have it because they haven't figured out how to do it in a way that makes sense and will help people discover apps that they'll enjoy.

Where the Apple Store suffers is in discoverability: in some cases I've been looking for a specific app and the thing I'm looking for ends up "under the fold" because a dozen other apps have paid to get higher rankings for that search term.

I do not want TikTok-style feeds of promotional material clogging up my App Store. I want to get in there, find the thing I want, then get out. I do not spend my life looking for inspiration from apps in the store.


Everybody has different interests and needs, but personally, I hope they never implement your suggestions for the App Store. I, like I'm sure the vast majority of the Apple user base, just want an App Store, not something filled with social features.


Not really the same. They switched from a proprietary platform to a free platform. That is a much larger jump than simply switching proprietary platforms with much larger costs and gains. They wanted control and switched to get it, whereas you just swapped out landlords.


So first of all, I don’t think it’s true that the article describes a switch to 100% free software. For example, the author bought a mirrorless camera, which is going to run proprietary software. Second, and more importantly, I think your comment implies that the only lever of control is switching between free and non free software and I don’t agree with this. For example, I am now protected against Google terminating or otherwise locking me out of my account whereas I was not previously protected. Even though I disagree with your comment, I appreciate that free software has some advantages over proprietary software and I would not dispute the claim in general.


I would strongly advise you to think twice about switching to Logic. Apple has consistently dumbed down that product since they bought it, turning it entirely into a loss leader to sell you the Apple ecosystem. Logic was great when it was Emagic's baby, I wouldn't touch it with a barge pole now. It is the archetypal example of a product steered by executive strategy instead of user interests.

If you need a low cost vendor neutral DAW, Reaper is the way to go. I'm a semi-professional musician, and well compensated software architecture consultant, so cost is not a factor in my software purchases for music, but when I want a linear-paradigm DAW, it's Reaper. That product is dope and the company is awesome.


huh, ok. Can you say more about where you think Reaper outshines Logic? I have to say that as a non-pro, I was really impressed by what Logic offers for the price, especially coming from Ableton which seems to offer considerably less for considerably more money. My main use cases these days are: 1) hooking up my electronic drum kit via midi and using Logic's built in drum kits (these sound pretty good); 2) recording and processing spoken word. For both of these use cases, Logic seems superior to Ableton, which is the software I am most familiar with. I think Ableton's interface and user experience are vastly superior to Logic's overall, but for these use cases that difference isn't worth paying for. Thoughts?


Reaper was built by programmers who took advantage of everything software can do, so you can a) script the hell out of it, b) customize it like crazy, and c) route anything anywhere. So for building power-user interfaces for linear editing it's great. It does not have built-in instruments. Personally, I use Rob Papen's Punch and Addictive Drums 2 for 90% of my drum needs, so I don't miss them!


> I was so tired of trying to keep up with endless UI changes in Windows and Android.

iOS and macOS both have endless UI changes as well.


I use keepass for passwords, and save the password database inside OneDrive


It's not like you have two choices only. Linux!


In what universe does switching to a Mac to use Logic make more financial sense than buying a license for Ableton Live?

For what it's worth: if anybody else is facing the same conundrum, choose an OS-agnostic DAW like Reaper, Renoise or Bitwig, then it doesn't matter what OS you're on.


I did this with Affinity Designer, and now that I have successfully left marketing- and public-relations-driven Apple, the decision is paying off nicely.

Running Affinity Designer under a Windows 10 VM is a non-issue.

The OS-agnostic path is the only long-term way of working.

Now under Linux I use Resolve, BitWig and Reaper, Blender, Inkscape, Krita, Emacs and I feel confident that no corporation will dictate my workflow in the endless pursuit for world domination.

The reality is that we are living in some form of economic stagnation; worldwide changes are ahead and people will be poorer, which will push Apple's vision of $1000+ smartphones and overpriced computers out of place. I have converted my business to Arch and in the process saved a ton of money (previously earmarked for Apple Silicon). Now this money will be spent on my engineers, not on some half-assed marketing plan from a company out of touch.

The only software that kept me inside the Apple ecosystem was Sketch. Since the pandemic started, for good or bad, Figma has been taking Sketch's market share. I don't like the SaaS-for-design paradigm, but we adapt to clients' requirements, not our own taste. I used Inkscape for interface design long before Macromedia Fireworks offered a specific solution. So I am set for the future. :)


A fellow person of culture and taste, I see! :-)


This is The Way:)


+1 for Pixelmator Pro. I just looked through my tool set (Figma, VSCode, Notion, a default browser, Pixelmator Pro), and the only Mac-specific app is PP - and it's awesome.


It really comes down to individual circumstances. I myself switched from Logic to Ableton just because it is cross platform(I have a PC and a MacBook)


I don't follow about Ableton, I'm still using 9 and it works just fine. You don't need to bump Ableton versions.


...okay, but Logic provides its updates for free (since X), is cheaper, and provides a wealth more content than stock Ableton does in the form of Logic's downloadable content.


Pixelmator Pro is awesome


I found this bit very interesting

> I realised that my life while using Apple products is controlled by Product Managers/Owners who want to get a raise, rather than by technology people who share the same passion as me

Mainly because I feel exactly the opposite. I don't find Windows, Linux, or x86 technologically exciting anymore. Apple makes (IMO of course) the most technologically exciting CPUs, their GPUs are a breath of fresh air, and I love their approach to UI, APIs, and OS security.


OS security? NSO has what looks to be an infinite supply of remote arbitrary code execution exploits for the Apple ecosystem.


NSO has an infinite supply of exploits for pretty much every platform. It doesn't really matter what you run: if you are a high-profile target and don't take active countermeasures, you will be hacked. That's not what I am talking about.

Instead some things that have every day practical relevance for me:

- a read-only mounted system volume that prevents me from making stupid mistakes and accidentally sudo rm-ing important files
- application certificates
- all executables being code-signed to prevent tampering
- a built-in zero-knowledge password manager with automatic synchronization across all my devices
- full hardware isolation for DMA devices on Apple Silicon


> It doesn't really matter what you run.

What are you talking about? Do you have nothing to hide? Are you so addicted to Apple's "solution" that, as a tech-savvy professional, you cannot take countermeasures?

Practical? What is practical about knowing that your trusted computing device is an easy target? Don't you understand that "guilt by association" with a high-profile target via a false positive can ruin your life forever? And these "automated" processes will be "included" in all commercially viable OSes.

IDK, but sticking with FOSS and practical knowledge looks like a better solution than trusting whatever magical <Big Corporation> tech. But who am I to question the Apple fetishists of the current day? After all, in the past I was in this camp. So be happy with whatever "rationalization" you come up with. Everything is fine and dandy; the Big Apple is taking care of you.


Easy target as in „having an adversary that has millions of dollars to pay a company specialized in device hacking“? I think I’ll live. The idea that using FOSS makes you more secure is naive at best.


Using FOSS makes me more secure because of how I use it.

Because of the availability of a control surface absent in Windows and macOS, and most importantly kernel access. Yep, there are security problems in any OS, but to compare a custom-built Gentoo with your beloved Apple toy... please grow up :) Your argument is funny and nonfactual.

To trust that a company which publicly cooperates with oppressive governments and creates on-device scanning software, breaching all its "privacy" promises, is helping with your security is absurd.

I have used exclusively Apple computers since early 2000s, and can pinpoint the moment in which all that I loved ended.

The moment when the iPhone was born.

Since then fighting with Apple telemetry was "business as usual", the existence of Little Snitch is all the proof that you need.


Not just the Apple ecosystem, pretty much every ecosystem, especially Android and Windows.


Not sure why you're being downvoted, you're right. HN commenters just love to take every opportunity to hate on Apple.

Nobody said Apple's security is perfect. People's views are simply biased because when Apple's much tighter security is breached, of course it will make headlines everywhere. Windows and Android have far more malware, but you won't see headlines about it every single day.

Either way, unless you're a journalist, politician or some other high-value target, you're pretty unlikely to be targeted by exploits like the NSO one. But if you don't care about jailbreaking, you can still update your device to give yourself peace of mind.


Zero days are more expensive for Android than for iOS. That says something when Android is much more likely to be used by a target. Android security is broken but it is generally better than what Apple does.


For a long time there were actually TONS of Android exploits - either against Android directly, or via the installers companies put on the phones they shipped.

That has changed massively recently, my sense is rough parity currently (on a clean android phone).

Payouts are still a lot higher per point marketshare for iOS though, so maybe still harder to develop.


Why say market share? Exploit payouts per exploit are highest for Android in mobile space: https://zerodium.com/program.html


Normally, if a platform has relatively low market share, the value of an exploit is relatively low (and 15% market share is pretty low). An Android exploit gets you 85% of your market, maybe even more internationally, where these get heavily used.

Android is about $30K per point market share and iOS is around $130K per point market share (4x more).

For example, it's not that Exchange / Outlook are HARDER to get an RCE on that makes the payout so high for those, it's because they have a LOT more usage than something like postfix etc.
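To make the "per point of market share" arithmetic concrete, here's a back-of-envelope sketch; the payout and share numbers are illustrative assumptions, not quoted Zerodium prices:

```python
# Illustrative figures (assumptions): ~$2.5M for an Android full chain
# over ~72% global share, ~$2.0M for an iOS chain over ~15% share.
android_payout, android_share = 2_500_000, 72.0
ios_payout, ios_share = 2_000_000, 15.0

android_per_point = android_payout / android_share  # ~$35K per share point
ios_per_point = ios_payout / ios_share              # ~$133K per share point

print(f"iOS pays roughly {ios_per_point / android_per_point:.1f}x more per point")
```

Under these assumed numbers, the iOS payout per point of market share comes out roughly 4x the Android one, even though the headline Android payout is higher.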


I mostly agree. For example, I think that Metal doesn't get the love it deserves outside of the dedicated macOS engineers. I feel like Metal is substantially cleaner than OpenGL and way easier than Vulkan, and I feel like the Linux folks really ignore this at their own peril.

I hate how closed-off a lot of their ecosystem can be, and there's the obvious "I prefer open source" that every engineer says, but I do honestly (and sadly) think that macOS is the least-bad desktop OS out there right now.


Metal is irritating because of its exclusivity. Nobody wants to make games just for macOS. The supposed ease and speed of writing Metal is completely lost since you still have to write a Vulkan or OpenGL renderer. It's just more busywork for anyone not exclusively targeting Apple platforms.


That's a fair criticism, though is there any technical reason someone couldn't write a Metal->OpenGL or Metal->Vulkan or Metal->Direct3d wrapper?


Sure, there are already open source projects that do Vulkan -> Metal like MoltenVK.

But any time you are emulating one complex low level API with another, there are likely performance penalties. Not to mention you are raising the complexity of the software stack and adding more surface area for bugs.

It would be better if there was an open source low level graphics API that every OS could support in addition to their own proprietary APIs focused on ease of use.


> It would be better if there was an open source low level graphics API that every OS could support in addition to their own proprietary APIs focused on ease of use.

Funny enough, that would be Vulkan, right?


Yep


I guess someone could, but it ends up being more practical to use a translation layer (which inevitably ends up having at least some overhead) on a niche platform, while being able to wring every last possible bit of performance using Vulkan on the other platforms. Hence MoltenVK and not VulkMetal. Or maybe it's just about the momentum, which Metal doesn't have.

Note that WebGPU has been heavily inspired by Metal (AFAIK), and it uses translation layers on top of a number of APIs (whatever is available on the system). And it's not limited to browsers, despite what its name might suggest.

So it seems like your best bet for trying something Metal-like that works across platforms.


I think the reason that Metal is not interesting people is that you can't run it outside Apples ecosystem. If you learn Vulkan, your code will run everywhere.


Sure, more or less, I'm just saying that the API for Vulkan feels pretty steep and irritating, but the API for Metal is somewhat approachable.

I'm not a graphics programmer, but whenever I've gotten the itch, I've always gotten huge headaches trying to get Vulkan to work, and had basically no issues at all using Metal.


Maybe Apple should enter the world-changing business instead of presiding over the money-making one, then.


Metal is a beautiful API, and as a hobbyist GPU programmer I think Apple GPUs are what's really under-appreciated. Enthusiasts usually dismiss them as "mobile parts" and turn a blind eye to the very interesting features they bring:

- TBDR with user-programmable persistent GPU caches allows you to do some really cool things smartly, drastically cutting down the amount of work done and memory fetched
- GPUs are trivially exposed as what they are: machines with very wide SIMD ALUs
- resource binding graphs with full support for pointers and indirection; resource bindings that can be created and populated on the GPU
- sparse resources that actually work and are performant (nobody uses them on mainstream GPUs because they are apparently slow as f** there)
- etc.


Sure the CPU is great, but the hardware is just too unrepairable and disposable. The software is also too much about Apple lock-in. For example, Apple Music can't even play FLAC files.


> the hardware is just too unrepairable and disposable

Pretty sure it was before the millennium that computers became cheap enough, while the value of repair skills increased, that it was no longer cost-effective to have an old malfunctioning computer of any make repaired. I expect you mean that you want to repair it yourself, but if you are skilled enough to do that, your time is valuable, so it hasn't been worth your time for at least two decades to spend even 10 hours a year repairing a machine more than about 3 years old. And AppleCare is worth the cost of the extended warranty, so you don't have to repair it.

> Apple Music can't even play FLAC files

That's just that one piece of software. VLC runs on Apple Silicon, so you don't need to use Apple Music; but if you did and wanted to play FLAC files, you'd need to transcode them to ALAC, Apple Lossless (or mp3 or AAC for lossy). You could build and script ffmpeg to transcode all your FLAC files to ALAC in one CLI command, and because lossless is lossless, nothing is lost. I'm kind of curious just how fast Apple Silicon would do that... seconds, I bet, because it takes only a couple of minutes for my Core2Duo to transcode a full-length album of FLAC to ALAC.
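For anyone curious, a minimal sketch of that one-liner (assuming ffmpeg is on your PATH and the FLACs are in the current directory):

```shell
# Losslessly transcode every FLAC in the current directory to ALAC (.m4a).
# -c:a alac selects the Apple Lossless encoder; ffmpeg carries tags over.
for f in *.flac; do
  ffmpeg -i "$f" -c:a alac "${f%.flac}.m4a"
done
```

The `${f%.flac}` parameter expansion strips the extension, so song.flac becomes song.m4a alongside the original.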


> it hasn't been worth your time for at least two decades to spend even 10 hours a year repairing a machine more than about 3 years old.

The median hourly income in the United States is $20/hr. Buying a new computer every three years is certainly going to cost you more than $200 a year unless you buy extremely cheaply. And that's leaving out the massive environmental externalities of throwing older equipment away rather than repairing it. It also leaves out the fact that most people have free time that they cannot trivially trade for more working hours.

Furthermore, 10 hours a year seems ludicrous. Apple bricked my MBP with the Big Sur update (along with many other people's laptops, it was a bit of a scandal). I repaired the rather infamously not-repair-friendly Apple device in about half an hour with a part I bought on Ebay for less than $10.


The market sets the rate for onsite computer repair at $50-$100 an hour; those hours are not prorated, so 10 minutes could be 100 bucks, and 3 separate 10-minute repairs could be $300. The cost of a support call will include the time it takes to figure out what the issue is before the actual repair is implemented, not to mention the cost of new parts if necessary. And generally, software problems take longer to resolve than hardware problems, and there will often be more than one problem. Solutions take time, and when that time adds up to half of what a 3-year-old computer cost when new, it would have been more cost-effective to purchase a new machine than to invest repair costs in a machine that could die forever next week with no warranty. You could get lucky, but you could also get screwed. With new hardware under warranty there is less risk of the latter.


I'm responding to the following in your parent comment:

> I expect you mean that you want to repair it yourself, but if you are skilled enough to do that, your time is valuable, so it hasn't been worth your time for at least two decades to spend even 10 hours a year repairing a machine more than about 3 years old.

So your claim about the cost of onsite repair is completely irrelevant. We're talking about the cost of repairing something yourself, which you very frequently can do, even when you're working with something as unrepairable as a Macbook.


> it hasn't been worth your time for at least two decades to spend even 10 hours a year repairing a machine more than about 3 years old

I replaced the battery in my X1 Carbon using a screwdriver in about 15 minutes. No significant downtime or hassle.

> You could build and script ffmpeg to transcode all your FLAC files to ALAC

Hardly "Just Works" though is it? I might as well stay on Linux...


I'm not sure battery replacement falls under the category of repair, more like required maintenance.

If you run Linux, sooner or later you're going to want to compile software from source. Building software on Linux is hardly any different from building software on any other *NIX platform. Developers and package maintainers have made the process easy. So whatever your pleasure, please indulge.


> I'm not sure battery replacement falls under the category of repair, more like required maintenance.

Which would suggest that Apple's glued-in batteries are a bad idea? Whatever one chooses to call the process of changing a battery, this is unnecessarily difficult with Apple hardware.


I love my 2011 MacBook Pro's hardware - the case, the keyboard, the hinges. I liked OS X Lion just fine, and every update since then has seemed to be a downgrade... I still need to use the 2011 Office I bought back then now and then, or I'd have converted it to Debian years ago.


I have a really beefy System76 laptop, but after using the Mac Developer Transition Kit, I'm eagerly awaiting the Apple M2-based MacBook Pro. As much as I love Linux and the Linux desktop, I miss the polish that macOS has.


I'll give you CPUs but I hate their "dumb it down" approach to UI, and Apple's only passion seems to be coming up with new ways to lock the user in and stop them from doing what they want with the PC they bought.


In the past people used to trade their freedom for shiny objects. Be careful.


What is this „freedom“ you are talking about? Are you suggesting that my experience somehow lacks freedom?


For now you are only being watched and manipulated, but Apple is already working on ways for the shiny device to snitch on you. The progress of the shiny object becoming a leash and a whip is slow and steady. You don't want the frog to jump.


Pah, folks have been predicting doom and gloom and total lockdown on Apple systems for over a decade now, and so far, not a single step has been taken towards it.

In fact, I feel like my freedom is more practically restricted on Linux, with its general hostility against distributing compiled code.


Not a single step? Please install a trial of Little Snitch on your beloved Apple computer and observe. I see you don't like to listen; you just casually speak nonsense.

If this thread shows the reality of the level of critical thinking among tech-educated people, we are living in 1984 in full swing.

Attacking the author for sharing personal decisions backed by factual data, and dismissing arguments with personal "ease of use" points. Just totally out of touch with reality.

And when I think that most of these people are producing the software solutions of today - OMG.


I'm afraid the battle is already lost my friend.


I think that you are right. Maybe some form of denial has blinded me, but watching the uncensored reaction here first-hand helped a lot to remove any form of romanticism. Greed trumps all.


What kind of ridiculous metric is "technologically exciting"? CPU/GPU speed is really just a cat and mouse game--soon enough an even crappier suite of Electron apps will make the M1 feel slow. Apple's UI approach is just terrible lately [0][1], and the OS security comment must be a joke in light of the CSAM scanning debacle--are you living under a rock?

[0]: https://www.cnbc.com/2021/09/22/ios-15-how-to-move-safari-ad...

[1]: https://news.ycombinator.com/item?id=27559832


Placing the URL bar at the bottom is brilliant, UI-wise, considering the height of the average smartphone nowadays. And anyway, they made this an option for those who don't like it. The "mess" discussed in your second link (the hovering URL bar) was only present in the beta and has been removed from the release. Finally, CSAM scanning has nothing to do with OS security, but with privacy.


I did not realize how much I needed this change in my mobile browser until I installed iOS 15 yesterday.

I already cannot fathom how we dealt with the URL bar at the top of these big phones for so long. It's been a long time since a major UI change in some popular software actually made me happy, let alone this ecstatic. I'm usually one to complain about unnecessary change (like Firefox's recent UI refresh).


Sure is - and Firefox has done that (while still allowing you to put it at the top if that's what you prefer) for several versions now.


> Placing the url at the bottom is brilliant UI wise considering the height of the average smartphone nowadays.

It's not at the bottom, but within the content viewport. Browsers need a dedicated "safe space" for notifications and symbols like a lock for HTTPS. A floating address bar over the content of the website could easily be faked using HTML/CSS/JS and abused for phishing.

Sure, put it at the bottom, but not as a floating pill over the content.


One of the biggest opportunities in technology right now is to re-run the NeXT/Apple playbook and create a new hardware/software technology company focused on developers. Good quality hardware married with a great OS / window manager. Most of the software we use to create software is Electron apps: a thin layer of native code running a bunch of JavaScript. The main notable exception is Xcode, but the ecosystem of non-iOS/Apple developers is far larger. I wish someone would do this. I'd pre-order tomorrow.


If by "biggest opportunity" you mean "biggest opportunity to lose all your money and chase a pipe-dream" then I agree :)

'Next/Apple' isn't a quick playbook - it's an over 30 year R&D effort to create a hugely complex software and hardware business, and it spent about $100 billion in R&D to get its products where they are today (at the absolute cutting edge of technology). Writing your own modern OS and building/manufacturing good hardware to compete with this is difficult enough, and then you have the even bigger challenge of getting all the major software vendors to support your new platform.


"'Next/Apple' isn't a quick playbook - it's an over 30 year R&D effort to create a hugely complex software and hardware business, and it spent about $100 billion in R&D to get its products where they are today ..."

But isn't this much, much easier if you just piggyback on the Apple hardware ?

I always expected this to happen.

Circa 2008 or 2009 I thought that any day now there would be a Linux distribution built specifically for one single Apple laptop. No hardware issues, no gremlins, no moving targets - you would have a (very) fixed hardware target and could optimize just for that. Then I, as a user, could just go to the Apple store, buy a nice shiny device, install MBAlinux on it, and call it a day.

I really don't understand why this never happened. Further, in many ways the opposite happened - installing Linux/FreeBSD is weirdly painful on Apple laptops, which is unexpected since we all know what is inside them and the installed base is huge.

So I would suggest that you could, indeed, build a hardware/software ecosystem - just let Apple build the hardware part ...


Linux is a complete mess that may never get fixed. The problem is people. It is a representation of democracy: a messy combination of half-arsed solutions that forms a workable cohesive whole. This is not a valid competitor to the Mac. It is a compromise.

Let's take Ubuntu as an example. Today you can get Ubuntu laptops that will work out of the box. Is that true tomorrow? Absolutely not. The next distro version will break something in the hardware. I have been burned by this twice now. At the end of the day the Apple premium is not really a premium: it ensures that they continue to support their legacy hardware for years. The people who bash the premium as some sort of "idiot tax" are actually valuing the software that runs on the machine at $0. There are too many people in this world who don't understand how much effort it takes to create and maintain good, reliable software. You see it on the App Store, where people can't fathom spending 99 cents, and you see it in the bashing of Apple devices.

Let's assume that your hardware works beautifully with the current version. Then you actually look at the apps shipped with the distro. They are poorly made and do not form a cohesive OS. You are forced to hunt for open-source equivalents to basic stuff like "paint". Have you tried using the calculator or notepad equivalents? They suck compared to the simple and easy-to-use Windows and Mac equivalents. This is something even Windows gets right. It comes down to the fact that Canonical does not have the resources to build each app around unified design and UX principles, so they farm it out to the "open source community".

Finally, why does each distro version seem to break something on the same hardware, year after year? There seems to be a serious lack of regression testing on these distros. For 10+ years I have witnessed how one version of Ubuntu breaks some stuff and fixes others, then the next version fixes some stuff but breaks previously working items. Then it gets worse: the subsequent version breaks previously fixed stuff again! I am forced to QA the entire OS every time a new release comes out and hope I don't miss something (which I always do)!


You know what is a compromise? Running Docker at 1/4 of its native speed inside a VM.


My solution for this is a headless Linux box under my desk, with remote VS Code and ssh from my MacBook. The best of both worlds so far, for me.

I did try to use Linux full time, but the UI drove me up the wall, so I'm back, comfortable in macOS, while my dev work flies on that Linux box.
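If anyone wants to replicate the setup, a hypothetical ~/.ssh/config entry is all that VS Code's Remote - SSH extension needs to target the box (host alias, address, and user below are placeholders):

```
# Hypothetical entry; replace HostName/User with your own box's details.
Host devbox
    HostName 192.168.1.50
    User me
    ForwardAgent yes
    ServerAliveInterval 60
```

With that in place, `ssh devbox` works from the terminal, and the same alias shows up as a remote target in VS Code.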


> This is something even Windows gets right. It comes from the fact that Canonical does not have the resources to build each app around a unified design and UX principle

Isn't it a bit on the nose that you accuse Linux of failing at the one thing that Windows is notoriously bad at, UX cohesion?

More seriously -- Linux reflects a different mentality and way of doing things. It is not for everyone. Downloading the software you want is the expected way to do things. I have no idea whether Ubuntu ships a "paint" replacement, but regardless, Pinta or Paint.NET are like three clicks away, thanks to the software repository approach.

Linux is more or less for people who want to experiment and configure things their own way, and make software that solves their own problems in the way that they want those problems solved. Creating a single, opinionated, out-of-the-box working desktop experience with perfect hardware compatibility with whatever bullshit proprietary-blob using silicon is out there is (a) hard, and (b) not what most Linux-using developers are interested in.

The people who use Linux largely recognize that yes, it is a compromise, but also that using Windows or macOS also represents a compromise. Having used Linux for nearly 15 years now myself, I can say confidently that the trade-offs for me weigh heavily in favor of Linux.


>Isn't it a bit on the nose that you accuse Linux of failing at the one thing that Windows is notoriously bad at, UX cohesion?

Yeah, you can criticize Windows for trying to update its designs with Metro and the like, but in reality all the old apps that worked cohesively are still there even today. Ubuntu and the Gnome or KDE based distros never had this to begin with. Just multiple flavors of the same cruddy base applications, since all the distros are using the same apps anyway.

>More seriously -- Linux reflects a different mentality and way of doing things. It is not for everyone. Downloading the software you want is the expected way to do things. I have no idea whether Ubuntu ships a "paint" replacement, but regardless, Pinta or Paint.NET are like three clicks away, thanks to the software repository approach.

Yeah, that's fine, but it unfortunately makes it a non-starter if you are looking for a direct replacement for macOS or Windows.

Yes, Paint.NET/Pinta/GIMP are always trotted out when I post this example. Pinta has been an unstable mess every time I have installed it. Plus "paint" is a near-instant-loading app that is several MB in size, whereas Pinta installs loads of supporting libraries because it is a more complex application. You're telling me that in 2021 they can't just ship a simple app that lets a user jump in to resize images or add some text to basic images? This hinders the usability of the system when I can't just quickly do a simple task and move on! It is as if the developers of these distros have never understood how a regular user uses a PC.

>The people who use Linux largely recognize that yes, it is a compromise, but also that using Windows or macOS also represents a compromise. Having used Linux for nearly 15 years now myself, I can say confidently that the trade-offs for me weigh heavily in favor of Linux.

The only thing that is a given is that any comment bashing Linux will ultimately attract someone like you who tries to twist and turn my words to justify it. I've seen it for 10+ years now without fail, so I'll leave it at that.

It's a shame, because I have looked at the messy bug tracker for Ubuntu and have tried to fix issues, but then I stop and realize: what is the point when it breaks again in some subsequent version of the distro? I wish someone would just dump a bunch of money, hire former Windows/Mac devs, and properly build a lot of the supporting components of some distro; then all the other distros could roll up those better apps, and we would have at least something that can be called adequate in 2021.


> The only thing that is a given is that any comment bashing Linux will ultimately attract someone like you that tries to twist and turn my words to justify it.

The only thing that's a given is that any post that proclaims the relative merits of Linux versus alternative operating systems will immediately attract posts like yours bashing it, so shrug.


More seriously than my previous response.

> I wish someone would just dump a bunch of money, hire former Windows/Mac devs and properly build a lot of the supporting components of some distro, then all the other distros can roll up those better apps and then we have at least something that can be called adequate in 2021

My point is not that you're wrong to feel this way, but rather that you should recognize that "adequate" is ultimately subjective. Adequate for whom? Adequate how?

The Linux ecosystem is largely designed by and for people who are willing to tinker, willing to customize, who want to design software that scratches their own itches, and who aren't looking for a perfect out-of-the-box experience from a distro. A handful of people want to bring about "the year of Linux on the desktop", but they're a minority and even for them the interest is usually secondary to their own use of Linux.

There's no "twisting your words" required here. What you want is a near-perfect out of the box Linux experience. What most Linux users want is ... something else. My point is simply that that's okay. Linux doesn't have to be for everyone. Your problems with it are not everyone's problems with it. In particular,

> that unfortunately makes it a non-starter if you are looking for a direct replacement for macOS or Windows.

Most Linux users don't want a direct replacement for macOS or Windows. Maybe there's a class of "theoretical Linux switchers" out there who would switch and would be the majority of Linux users if they did, but they are not, at present, the majority of the people using and working on Linux.

What Linux provides me with is (a) a well-integrated package manager containing fully free/libre software, (b) a comprehensible system (where I can understand fully how each part works), and (c) a modifiable system (where I can change how the system operates to the extent I want). Having a perfect replacement of the MS Paint application is not even on my radar. But that said:

> Plus "paint" is a near-instant loading app that is several MB in size whereas Pinta is installing loads of supporting libraries because it is a more complex application.

Maybe you're exaggerating, but on my system Pinta has an installed size of only 2.88 MiB and has only two direct dependencies. Maybe you're thinking of the fact that it's written in Mono (the C# runtime), but that's a shared installation with all other Mono applications. It's equivalent to Windows shipping with the .NET runtime or UWP.


> But isn't this much, much easier if you just piggyback on the Apple hardware ?

Sure, it's easier, but then I'm not sure what the point is or what makes it one of the biggest opportunities of our time.

I also understand why it never happened - there is already a unix-based OS which is designed with perfect compatibility with the Apple Hardware called OSX! I'm not sure what the advantage to a consumer would be for replacing OSX with linux - other than the fact that it gives consumers choice - but of course providing a distro that only operates on a specific Mac is then limiting hardware choice so it doesn't really solve that in some respects.

And if it's just for developers, then wouldn't developers want some choice of hardware, good support for tooling, the ability to test native apps without virtualisation, etc.?

IMO I suspect the Venn-diagram of developers who:

* want a Mac but don't want OSX

* don't mind that they can't upgrade their hardware

* are willing to run some totally-new operating system

* Accept that it will initially lack the support of the runtimes they use, and some software, and won't be able to develop certain types of software because of this.

* Accept that if they wish to continue using the OS for their next laptop they will be fully locked-in to a single hardware model.

is pretty vanishingly small.


In short, Apple does things in non-standard ways without explaining how to get another OS to work.

Apple doesn’t prioritize lack of binary blobs. The EFI firmware is all proprietary. All their Wi-Fi have been switched to Broadcom.

They do weird non-standard things to the Thunderbolt controller, e.g., you have to lie to the firmware and claim to be macOS in order for it not to disable the Thunderbolt controller.

Newer MacBooks hide a bunch of hardware behind the proprietary T2, and whatever embedded OS runs the Touch Bar.

MacBooks are not ideologically pure, and sunk efforts to get an OS working on other machines are often wasted on MacBooks because Apple does things in bizarrely different ways.


Steam Deck (For Games)

I only ever need a laptop when traveling, I have a big desktop setup at home. I plan to take my Steam Deck traveling with a portable monitor and keyboard.


I like the idea, but I worry that Apple's m.o. is to allow something like this in the margins and then cut it off at the knees if it becomes too successful. Whether by altering their hardware, using security lock-out (à la iPhone), or replicating it without acknowledging where it came from.


> But isn't this much, much easier if you just piggyback on the Apple hardware ?

Would that be even legal? I mean selling a commercial OS that would be marketed to install as a replacement OS on the most locked, most proprietary hardware on the market?


Didn't say it would be quick. Sure, it was a long cycle for them, but the ecosystem is much further along now. I think there's a market in the hundreds of millions.


> focused on developers

The problem with this is which developers? People who write embedded systems? Web developers? People who write custom Windows applications?

Any given developer subset is likely to find this hypothetical new developer computer to be either too complex to use or not differentiated enough from Windows or MacOs (or ChromeOS).

> Most of the software we use to create software are electron apps

This is not true for most people whose primary employment is writing software, or working on software teams. Most people who get paid to write code work primarily in either the Java or .NET ecosystems and use something like Eclipse, IntelliJ, or Visual Studio. (Many more are using niche-specific tools in a captive platform like Oracle, SalesForce, SAP, etc.) If the new platform doesn't have 100.0% binary compatibility with legacy tools written for Windows and/or MacOS, its addressable market shrinks substantially.


I would also argue that focusing on developers too much is actually a loss for users. Developer productivity above all is how we've ended up with resource hogs like Chrome and Electron as well as never-ending erosion of customizability in software as well as the user's level of control and privacy.

It's critical to have a great developer story yes, but to make a stellar platform that needs to be balanced with a great user story, and that means developers might not always get everything they want down to the letter.


Agree to disagree. Java is inherently cross-platform. JetBrains stuff runs on any Linux platform, as does VSCode and any of the modern development workflow stuff. The development stack is steadily moving away from native apps. VSCode is essentially a webapp, and indeed it can be run as one.


Fair enough about Jetbrains!

VSCode is not the entire stack needed to build Windows desktop applications. There are tens or hundreds of thousands of developers who build applications for the Windows desktop. I'm not (for the most part) a .NET dev, but my current understanding is that only Visual Studio running on Windows is a first-class citizen with the ability to access all parts of the dev stack. The Windows dev stack doesn't need to move away from native Windows applications any more than does Xcode need to move away from MacOS.


>One of the biggest opportunities in technology right now is to re-run the Next/Apple playbook and create a new hardware/software technology company focused on developers.

Why would that be an opportunity though?

It would be a low margin niche, with a small market segment, of which most would stick with Apple/Lenovo/Dell.


Software developers are a large and massively growing market segment. That's not niche.


They're a tiny sliver of the global population, which in no way constitutes "one of the biggest opportunities in technology right now".


Lol. Ok. You're right. Congratulations! How's it feel?


And only a tiny fraction of them would care. I’m quite happy with my iPhone and don’t want a random HN freedom phone. I don’t ever even intend to develop something for my phone and if I did, I would want to target android and iOS.


Software ecosystems have a major chicken and egg problem.

If you wanted to create a new platform, your best option would be to go the other way and make an OS that was “just electron” and ran all the electron apps in the world faster and better than anything else. Unsurprisingly Google has tried this with Chromebooks, but their track record on consumer product development is so poor that perhaps they just didn’t execute well and someone else could pull it off.

Another challenge is that if you did that you gain wide software compatibility but you lose any obvious differentiator. The likely way to win would be if you could make a laptop that was “just as good at running web apps as your Mac, with just as much battery life and just as nice hardware” but somehow cost under $400 or so.

I actually wouldn’t be surprised if we see that coming out of Chinese OEMs in the next decade.


> but somehow cost under $400 or so.

Screens are the most expensive parts of a laptop, phone, or tablet.

So you are not going to get a good screen.


That would be the trick. Perhaps a screen manufacturer will realize they could make better margin if they built just enough “netbook” around the screen to sell it as a computer, or perhaps screens will just get cheaper until at some point a “good enough” netbook screen is very cheap.


Put Kubuntu on any modern Dell desktop or Thinkpad laptop. In 95% of the cases enjoy total hardware compatibility right out of the box, and a UI that will pass for "The next Windows" for most people whose needs do not exceed browsing the internet and Facebook properties.

If they need a photo manager, in my experience the most common application need after a web browser, then Digikam really cannot be beat.


While the hardware may work, there are so many papercuts people will have to learn to deal with. Plugging in an external monitor may or may not work. Audio may or may not switch the way folks are used to. (Try telling someone to launch alsamixer.) Want to use Bluetooth? It might work. If it doesn't, you're going to be messing with things deeper than a "Facebook/Internet user" wants to deal with.

I am a full time Linux user. And I'll probably support anyone who wants to try it until the day I die. I absolutely love it. But we still can't enjoy some of the simplest use cases without screwing around with configs and in some cases, writing scripts that listen to DBus or udev.... So every time I hear someone say, "just use Linux" I think... nah, just buy a Chromebook (and - yeah, use Linux). If your needs are any more than that, Linux might not be for you.
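For anyone who hasn't hit this, the "scripts that listen to udev" escape hatch mentioned above usually has this shape: a rule file that fires a script on a hardware event. The file name, connector event, and script path below are hypothetical, just to illustrate the pattern:

```shell
# /etc/udev/rules.d/95-monitor-hotplug.rules (illustrative file name):
# run a script whenever the graphics (drm) subsystem reports a change,
# e.g. an external monitor being plugged in or unplugged
ACTION=="change", SUBSYSTEM=="drm", RUN+="/usr/local/bin/on-hotplug.sh"
```

The script itself typically just re-applies a known-good layout (e.g. by calling `xrandr --auto` against the user's display) — exactly the kind of plumbing a "Facebook/Internet user" should never have to see.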


+1. I used to run Linux on my primary laptop. I'm a fairly competent Linux admin. I just got really tired of being forced to be a fairly competent admin so much of the time when I was trying to just get something done.

> writing scripts that listen to DBus or udev

Exactly. Not to mention the preceding step of spending 30 minutes in forums to find someone else who has had this problem on a system with exactly the same motherboard so you don't try the things that didn't work for them.

I'm still 100% Linux on the server, headless Linux doesn't have nearly the warts as GUI Linux.


What you're describing certainly doesn't describe my experience of the last ten years or so with Linux. Sure, there are some things that won't work with Linux by choice of the manufacturer/developer, but external monitors, audio, and Bluetooth generally just work with a Dell laptop. I've been using Dell laptops in a variety of scenarios for years and that sort of stuff doesn't even cross my mind any longer.


Reasons like these are why I switched to using virtualised Linux for most of my development. Trying to run it natively on hardware works great 95%+ of the time, but that last 5% is usually tricky to fix without delving into years-old mailing lists. And even if you can get things working, there are still often serious quality-of-life problems, like the Bluetooth stack randomly failing 5+ times a day.


This is all true in Apple and Windows land as well. Especially if you are not "all in" on Apple and have non Apple things you are trying to connect.


> Plugging in an external monitor may or may not work. Audio may or may not switch like folks are used to.

> Want to use bluetooth? It might work.

Sounds like the problem of installing Linux on hardware designed for Windows. All those things work flawlessly on my Purism Librem 15, which came with preinstalled Linux. (Ok, I did not try Bluetooth, but saw reports that it works.)


A single Gladwell of anecdata: I've got a thermal printer which talks Bluetooth - works flawlessly when sending data from my Macbook; outputs garbage when sending the exact same data from a Raspberry Pi 3+ (which is definitely Linux on hardware designed for Linux.)


That is all true.

In the context of the original comment, though, this does narrow down the market that can be addressed by a company running the Next/Apple playbook.

It seems vastly cheaper to solve these issues than to start a new company to produce its own hardware and OS.

If you did start such a company and prove there was a market, then you're making a bet that Canonical (or KDE devs) won't put you out of business.


100%. That's what Apple got right then and one of the reasons why Linux has never been able to really penetrate the desktop or professional desktop market. Otherwise you're constantly debugging things that should Just Work like external displays, random device drivers, etc.


> Plugging in an external monitor may or may not work

Sadly, this is now true for Macs as well. Where by "not work" I mean: not finding/not supporting the proper resolution and/or refresh rate for the display.


I had a similar experience related to this in the past couple of months. My company issued me a Macbook, and forced an upgrade to Big Sur recently. I'd been working through the pandemic by plugging the laptop into a HDMI monitor using a USB-C-to-HDMI adapter from Anker (purchased on Amazon: https://www.amazon.com/dp/B07THJGZ9Z/). After updating to Big Sur, MacOS refused to recognize the Anker adapter with a notification "USB Accessories Disabled", and a note that it was using too much power. I did a few hours of research trying to figure out what the power draw actually was (with no monitor plugged in) and scoured specifications for other adapters in the hope that I could identify one before purchasing that might work, but found scant information published on the power draw of various adapters.

I never succeeded. I just use the Macbook with only the built-in display now.


Agreed, patching/generating custom EDID configs is not my idea of "it just works"...
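For context, "patching EDID" here typically means capturing the display's EDID blob, hex-editing the broken mode out, and telling the kernel to load the fixed copy instead of what the monitor reports. A sketch of that last step, assuming a mainline kernel with the drm EDID override and a repaired blob named `fixed.bin` (the file name and connector are hypothetical):

```shell
# Put the repaired EDID where the kernel's firmware loader can find it
sudo mkdir -p /usr/lib/firmware/edid
sudo cp fixed.bin /usr/lib/firmware/edid/fixed.bin

# Kernel boot parameter telling the drm subsystem to override the EDID
# for one connector (append to GRUB_CMDLINE_LINUX, then update grub):
#   drm.edid_firmware=HDMI-A-1:edid/fixed.bin
```

None of which a normal user should ever have to know exists — which is the point of the comment above.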


I'm on a Dell XPS 13 with Ubuntu right now. People sometimes give examples of things like external monitors not working for them on Linux.

Here is one. Randomly, based on no relevant input from me or changes in the laptop's state, my network connection dropped and the Network Manager UI was telling me no network adaptor could be detected.

Some fumbling around in the Terminal (including various reboots not solving the issue), and managed to enable the wireless adaptor which apparently could be detected and connect to my network, though at the same time the UI was telling me in no uncertain terms that no wireless adaptor was connected to my laptop.

Then later, again randomly, based on no relevant input from me or changes in the laptop's state, the UI agrees there is a wireless adaptor connected after all. This is on a machine currently in near-factory state with certified compatible Ubuntu preinstalled.

I share this example because one can at least comprehend why random monitors or graphics cards or what not do not cooperate without fiddling, can comprehend certain apps failing and crashing, can comprehend other unusual bugs. The UI thinking and acting like there is no network card for no reason whatsoever, on the other hand, is completely inexplicable even to competent users.

Someone needs to just commercialise a proprietary, at least initially closed version of Linux (so as to turn a profit) with good design principles in mind, and deal with lawsuits and license issues later. There is plenty of money in it.


Insurmountable barrier to entry. Not a chance.


What about the barrier to entry is insurmountable? Apple hardware in about a decade went from being something every developer I know raved about and loved to being something everyone complains about and is generally unhappy about. Apple went from being a computer company to a consumer electronics company. The more they expand into other consumer verticals (tablets, headphones, CARS), the more the computer products suffer.


Yet their computers are still massively popular with developers.

As for the barrier, it is in the software and app ecosystem.


I'd argue they're popular because no better alternative has been created (yet).

My point with the software is that most software people use to create software/apps is increasingly created with web technologies / electron apps, so the native app ecosystem on a desktop machine is increasingly weakening as a moat.


I'd say the #1 reason is that most developers (myself included) want the intersection of:

- Unix-like, developing on Windows is pure torture

- Don't waste my time with configurations, drivers and other crap, I want a machine that I can be productive with out of the box. Take my money if you have to, but I don't want to edit Xorg.conf ever again.

You can talk about polish all day, but no machine that doesn't satisfy both is even close to appealing for a majority of developers in my experience.

You are right about the native app ecosystem being less of a blocker, but that's in line with my point.


What are you talking about? Any large tech company could pull that off. The problem is those large companies are not interested in catering to a million developers, they are interested in catering to a billion people.


Remember windows phone?


I do. I remember Microsoft stores too. They went 20% of the way and stopped, and decided that extracting rent from Office and Azure was all the work they wanted to do, rather than continue investing in hardware and in person support. Same with Google and their devices, except they did not even bother with in person support.


>They went 20% of the way and stopped

How much more money would you have siphoned into basically empty stores to see it as having gone "100%" of the way?


Depends if I wanted to compete with Apple or not. I would have spent whatever it took. The board members at these companies obviously decided that cash now was more important than competing with Apple.


Maybe because they understand the sunk cost fallacy :-)


A sunk cost fallacy is only applicable if the assumption is the venture would result in failure.

Which is even more damning of Microsoft / Google management.


>A sunk cost fallacy is only applicable if the assumption is the venture would result in failure.

The sunk cost fallacy is not really about what the venture would actually do. One could be said to have fallen prey to it even if they double down and the venture eventually succeeds.

What's important is that at the time of the decision (a) the path doesn't seem to be working, and (b) they think "but I've spent too much to quit now".

This is more likely when one assumes it still has a chance to succeed than when one assumes it will inevitably result in failure. Nobody who assumes inevitable failure would decide to continue.


I did not feel it needed to be specified, since it is a trivial fact that nothing in life is certain. But Apple's product offering is a top-to-bottom customer experience involving in-person help at stores around the country. Microsoft must have acknowledged that, since they went as far as opening stores and coming out with that line of non-malware Windows products (as a side note, it is ridiculous that Microsoft even let their ecosystem get to that point). Which, yes, might have meant empty stores, but that is because they failed to continue investing in their mobile products, or even non-mobile products. They would have had empty stores for 10+ years while they slowly built it all up, just like Apple had to.

All I know is at this point Microsoft had two options: continue investing into creating an alternative to Apple, or cancel their plans and sit back and let the Office/Azure revenue flow in.

Maybe it was a long shot, maybe they decided the size of Apple's customer base divided by two was not enough to satiate them, but whatever the case, they signaled that they do not have the talent/gumption/appetite for risk to pull it off. But if any company did have the opportunity to go for it, I would think Microsoft (and Google) with their income stream would have been in position to do it.

Both companies seem to dip their toes, but never follow through.


As a Mac guy I thought that the Windows Phone wasn't a bad device. My buddy had one and it had all sorts of great features, but they all worked within the Windows/Xbox universe he was in.

I would have liked to see it succeed if only for there to be more competition.

It also had a great 'copy and paste' feature


Yep the phones were actually in terms of OS and hardware equal or better than iOS/Android. However even back then, the app ecosystem hurdle was already insurmountable.


This just proves it’s even harder. Even if by some insane luck you manage to build something that is very good, the general public still doesn’t want it.


>Even if by some insane luck you manage to build something that is very good, the general public still doesn’t want it.

This hits hard


It could've succeeded if they didn't reset the app ecosystem twice (they already didn't have many apps, but the resets definitely didn't help).


I'm pretty sure it could have worked; their phones were getting traction in Europe. If they had not made the big framework screwup that destroyed their developer base, it would have worked.


Yeah. No apps = no success. A developer-focused computer would obviously run Linux, so you'd have a large ecosystem right from the start.


Yet linux desktop is still where it has always been. A place for nerds who like to tinker.


I loved the UI and the phone offerings from Nokia. I wish they would resurrect it - and I'm a dedicated MacOS/iOS user.


>Any large tech company could pull that off.

The prerequisite of being a "large tech company" to pull it off is already a huge barrier to entry.


Have you looked into Purism (https://puri.sm)? Purism makes its own hardware such as the Librem line of smartphones and laptops, and maintains its own Linux distribution called PureOS. Purism also funds the development of apps (https://puri.sm/fund-your-app/).


> The Librem 5 is a phone built on PureOS, a fully free, ethical and open-source operating system that is not based on Android or iOS.

And physical hardware kill switches? This almost seems too good to be true.


The physical kill switches work as described, but yes, the phone is "too good to be true."

It will not (yet) replace your iPhone, although you can get pretty far on your own if you don't mind SSHing into such a device and messing with stuff on your own.


The catch is that it costs more than an iPhone, gets about 2 hours of battery even if the screen is off, and has a CPU slower than a $50 Android.


Had not seen this. Thanks for sharing.


>One of the biggest opportunities in technology right now is to re-run the Next/Apple playbook and create a new hardware/software technology company focused on developers.

If it was focused on developers, it would by definition not be re-running the Apple playbook. Developers are much too small a market to justify that level of investment.


> Most of the software we use to create software are electron apps, a thin layer of native code to run a bunch of javascript.

I'd be interested to know what "we" means in this sentence.

Do you mean your team?

Or developers in general?

If the latter, I am not sure what industry you work in, but I very much doubt the majority of developers operate in the environment you describe.


i think a lot about this, these days, to be honest. re-run the next/apple playbook? no. build an integrated hardware and software company that fixes personal computing through ground up rebuilds of everything from processors to operating systems and programming languages to solve the annoyances of security, privacy, software bloat and the generally humdrum nature of new technologies being shipped these days?

maybe.

it would be amazing to build systems that are so beautiful that they inspire people like berners-lee and carmack to do their things.


Fuchsia is really the only new OS under development that I’m aware of. And it’s not even clear if Google intends to ship it outside of niche embedded use cases like Nest products etc


Developers unhappy with Windows, Mac, and Linux is not a very big market.

Most developers I know are happy with their mac or windows spyware, or their rough edges linux rig.

