I basically went back 5 or 10 years and replaced every "modern" technology solution. I now pay way more than I did with iCloud, but I am back in control. I am more productive.
- Listening to Music takes 3 clicks and just a few seconds
- Wired headphones never run out of battery and have superior audio quality
- I can take real photos with high quality instead of relying on ever-newer iPhone models costing thousands of dollars, which will always have lower quality than a mirrorless camera at the same price
- I have fun again discovering bands and artists on Bandcamp instead of mind-numbingly listening to Apple Music playlists
- Coding via neovim in a terminal, staying on my keyboard to navigate not only tmux and co. but also my OS, is way more productive and faster
Which of these can I not do on a Mac?
Most of these seem like Pyrrhic victories to me, as the Apple versions rose to success with all of these as competition by providing a better experience. I certainly wouldn't want to carry a mirrorless camera everywhere, nor deal with Bandcamp.
At one time, Apple clearly was a better experience.
Now, with features removed and new "privacy and security" features added, I'm not sure they are a great experience. For example, my Macbook reminds me of Windows Vista, except worse, every time there's a system update and I have to reboot to permission the camera for a web conference.
I've learned to dread OSX updates because instead of adding new useful stuff, it seems like we just move things around, change out the icons and add some more intrusive "touch the fingerprint reader" authentications... plus I have to re-permission half of the apps and hardware just to do my job. Then there's the whole reboot, unlock, install, reboot, re-lock cycle. It's seriously worse than Windows Vista. Anyone remember the I'm a Mac/I'm a PC commercials?
Big Sur performance on older hardware has been a disaster though. Even on a $2500 MBP 15” from 2017. On M1 though it’s excellent.
However, Big Sur runs alright on my 2015 MacBook!
If you're referring to the Kext/DriverKit changeover, this is just false - Kexts still work in Big Sur, they're just trying to slowly shift the ecosystem to DriverKit. Nothing has truly changed there yet.
If you're referring to a broken thing within Kexts/IOKit itself, that would be a pretty severe bug that would get attention within Apple. I feel confident saying this as I've reported bugs like this and they get appropriate priority levels.
Lastly, if you're writing (signed) driver code, you generally have access to resources within Apple to get answers to questions. I'm not even a large company and it was relatively easy to get in touch with those teams - and this stands in contrast to other teams within Apple.
To be fair, that also describes Windows. As best I can tell, every release since 7 has primarily focused on renaming and adding indirection to the ways you get to the same old control panels.
And I haven't ever had to reboot to allow a camera. I have several, ranging from a microscope to an SLR that also is my webcam.
I thought this was the UI museum feature where you can time travel back to Windows NT4 one layer of indirection at a time.
For example, you go to the proxy setup, and the PAC URL location is a tiny text box no matter how big the window; if you have restrictions on your system you can't view the full address or copy it.
Curious what's wrong with dealing with Bandcamp? They seem just about the best place to buy music online. Majority of the money goes to the actual artist and you have a good choice of formats.
Agreed. Bandcamp does this right by offering V0 compressed MP3s, 320kbps MP3s, AIFF, FLAC, ALAC and I think even Ogg Vorbis.
In fact, I really like their catalog and match algorithm, but I always thought Apple Music outside of its own fence feels gross.
Bandcamp recommendations and artist/album similarity ratings are top notch, and there are many clients for the platform.
I personally can't use Spotify or Apple Music after taking advantage of Bandcamp.
Bandcamp also only takes a 10% - 15% fee for digital purchases, and 10% of purchases for physical items, which is an incredible deal for artists compared to what they make from streaming services like Apple Music or Spotify.
I can stick my iPhone 7 in my jeans pocket. My Olympus Pen-F (one of the "smaller" mirrorless cameras) with a small prime lens needs a pretty big pocket only some of my larger coats have. Plus it's heavy enough to pull on said coat and make it uncomfortable.
Yeah, image quality is pretty bad on the iPhone compared to the Olympus. But when I go out and about and don't want a bulky thing hanging around my neck/forearm or an extra bag to carry, the Olympus stays home, so its effective image quality is exactly 0. The iPhone beats that hands down, even at night.
Now don't get me wrong, I love my Pen-F, and it's an incredible improvement over the DSLR I used to haul around before. But iPhones are getting pretty good for my needs now.
Now I am thankful my dad endured the pain and captured a lot of stuff on the JVC shoulder camera. At least the quality is pretty good, even for the '80s/'90s.
We don't capture events like we used to anyway. No one holds their camera up for more than a minute. So we're left with all these short videos. I like watching the really long stuff my dad shot.
Also, most DSLRs these days have some sort of multi-exposure functionality built in. Sometimes it's as simple as 'bracketing' (so you get multiple photos you have to merge together in post), but some fancier ones offer HDR functionality in-camera, so nothing to do in post.
That said, I now carry a mirrorless camera everywhere. The pictures are WAY higher quality and I have more control over the experience and the final output. Don't get a mirrorless if that's not important to you.
When I want to just take a quick selfie, I still have my phone. The iPhone is great at taking pictures, but it's never going to be a match for a mirrorless, which has an enormous sensor compared to anything you'll get on a phone.
This item is the reason I am leaving the iPhone and trying an unlocked/stock Android device.
My music collection is a directory tree that I have curated and organized since 1996.
The correct way to deal with this is to move this directory tree onto my phone (either via network transfer or attaching a USB filesystem) and then browse those files with a music player app.
Anyone familiar with iDevices knows that every piece of the simple, standard workflow I just described is totally impossible.
Instead, you have to manually build playlists inside of iTunes while "importing" your music (and storing two copies of it) and then transfer those playlists (one by one) to the iDevice and ... it's just insane.
It is a workflow built for people that impulse buy a track here and there ...
On the contrary, any number of apps support precisely this.
- https://readdle.com/documents (not just music)
- https://apps.apple.com/us/app/sony-music-center/id724406878 (if you use Sony headphones)
There's no point in listing all of them.
Having ripped some 20,000 CD tracks a couple decades ago, I use several such apps.
As a user of such apps, I'd argue with "correct" though, given Apple Match with iCloud One combo and last year's update supporting high resolution / lossless.
Over time, I have come to use those apps less than Apple Match, which mirrors my rips using tracks from Apple's library where they have them, or uploads mine where they don't, giving me more seamless access across all devices, spoken access from Siri on HomePods, etc. Match was a debacle at launch, but it is now almost never wrong on even the most obscure tracks.
It supports FLAC, obviously, but also Opus. For my phone, I've converted my FLAC files to Opus and can carry my entire music collection wherever I go. I used their import tool to build the same folder structure that I have on my NAS. I have iTunes installed on my Windows machine (no Macs at home), but I try to avoid it if at all possible.
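If anyone wants to do the same, the transcode is easy to script. A rough sketch (Python; the paths, the 128k bitrate, and the `opus_commands` helper name are my own assumptions, and it only builds the ffmpeg/libopus command lines so you can inspect them before running anything):

```python
import os

def opus_commands(flac_root, opus_root, bitrate="128k"):
    """Walk a FLAC library and build (dest_dir, ffmpeg_argv) pairs that would
    mirror its folder structure as Opus files. Nothing is executed here:
    create each dest_dir, then pass each argv to subprocess.run() to encode."""
    cmds = []
    for dirpath, _dirs, files in os.walk(flac_root):
        rel = os.path.relpath(dirpath, flac_root)
        for name in sorted(files):
            if not name.lower().endswith(".flac"):
                continue  # skip cover art, cue sheets, etc.
            src = os.path.join(dirpath, name)
            dest_dir = os.path.normpath(os.path.join(opus_root, rel))
            dest = os.path.join(dest_dir, name[:-len(".flac")] + ".opus")
            cmds.append((dest_dir, ["ffmpeg", "-i", src,
                                    "-c:a", "libopus", "-b:a", bitrate, dest]))
    return cmds
```

Separating "plan" from "execute" like this makes it trivial to dry-run against a big NAS tree before committing to hours of encoding.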
There's a growing trend these days of making every single player app into spyware, and it's sad. I won't use reader apps or player apps that read or play local files that are going to transmit my activity off device for no reason that benefits me.
This is not completely true. You can store your iTunes collection wherever you like, organized however you like. You don't have to duplicate anything, although it's true that the default is for it to "import" it.
You can also create an "all my music" playlist which you can sync with the iDevice.
I used to have this setup with my music collection on Google Drive (because it didn't fit on my MBP's internal drive) and synced some of it to my iPhone. It worked well enough. The issue was more that all the music couldn't fit on the phone, so I had to pick and choose anyway.
The real gotcha is that iTunes didn't support FLAC, so I had to convert everything to m4a.
Yes, but then how do you deal with that enormous "all my music" playlist once it is on the iDevice?
You can't browse by directory. You can't organize or display based on filename. So I guess I could parse the whole collection, transpose the artist/title/album out of the filename into MP3 metadata, and then I would have a ... 30,000-track playlist?
Again, all of this makes perfect sense if you're impulse buying a track here and a track there and if there is some way to move that "collection" to a new device every 2-3 years.
It's just not for me.
In my case, I organize it by artist / album / track number - track title; or by compilations. I then search for the album or the artist. I never have just random single tracks, so a directory is an album, which I have in Apple Music.
But I guess you can't have just any kind of organization you want, which is something that folders could give you.
> So I guess I could parse all of the collection and transpose the artist/title/album out of the filename into mp3 metadata and then I would have a ... 30,000 track playlist ?
Well, in the case of a meticulously managed collection, I'd expect the files to have correct metadata. Again, if this isn't the case, and you rely on file name / location, yeah, you're gonna have a bad time.
Just for the record, I've never bought any track off iTunes. All my music is ripped from CDs.
WAV files don't have metadata like mp3 files (typically) do.
I'm not saying I copy the uncompressed WAV collection to my phone (~700 GB), but I am saying that my original metadata schema has all of the metadata in the filename:
Last, First - AlbumName - 01 - SongName - 3m25s.mp3
... and yes, I could parse and re-encode all of these, populating their MP3 tags with these fields, but man ... what a load of work just because iTunes can sort by 50 different attributes, just not filename:
... just look at all of those fine-grained, fancy ways to sort by ... but the most basic attribute of all, the filename, is missing.
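For what it's worth, that parse-and-retag step is less work than it sounds. A minimal sketch (Python, assuming the exact `Last, First - AlbumName - NN - SongName - XmYYs.mp3` schema quoted above; a tagging library such as mutagen would then write these fields out as ID3 frames):

```python
import re

# Hypothetical schema from the comment above:
#   Last, First - AlbumName - 01 - SongName - 3m25s.mp3
PATTERN = re.compile(
    r"^(?P<artist>.+?) - (?P<album>.+?) - (?P<track>\d+) - "
    r"(?P<title>.+?) - (?P<minutes>\d+)m(?P<seconds>\d+)s\.mp3$"
)

def parse_filename(name):
    """Split a schema-named file into tag fields, or None if it doesn't match."""
    m = PATTERN.match(name)
    if m is None:
        return None
    return {
        "artist": m.group("artist"),
        "album": m.group("album"),
        "tracknumber": int(m.group("track")),
        "title": m.group("title"),
        "length_s": int(m.group("minutes")) * 60 + int(m.group("seconds")),
    }

tags = parse_filename("Last, First - AlbumName - 01 - SongName - 3m25s.mp3")
# -> {'artist': 'Last, First', 'album': 'AlbumName', 'tracknumber': 1,
#     'title': 'SongName', 'length_s': 205}
```

The lazy `.+?` groups let a comma-containing artist name like "Last, First" survive the split; filenames with a literal " - " inside a title would need hand-checking.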
I feel it's my civic duty here to repeat: they do; in fact, they even support ID3 tags, but it's up to the software developer whether to support them or not.
WAV tagging has grown in support in the past few years, but yeah, iTunes certainly doesn't support them.
That’s a screenshot from a decade-old iTunes. And it includes the option to sort by name.
I’ve just checked with the most recent version of Apple Music, and the wav filename appears in the title field and you can sort by it.
No, it is not impossible. It is very easy, at least as a Linux user.
There is a project for that, called iFuse.
It basically allows you to mount the Media chroot filesystem and the app specific sandboxed filesystems without a jailbreak.
Shouldn't take more than an afternoon of a reasonably proficient developer's time. I agree that it's not great that you should need to do this kind of thing, but it's less effort than switching to android ;)
I believe VOX Music Player allows you to upload your music without using iTunes - although it is using their cloud sync instead. Flacbox also seems to let you download and play local files from Dropbox, OneDrive, Box, SMB servers, DLNA servers, ...
It doesn't seem "totally impossible"?
iTunes uploads it to Apple's cloud, since I'm paying for iTunes Match; my iPhone and iPads pull it down. It is very easy to search for a single album on the iOS players; they also make it very easy to find your most recent acquisitions, with an automatically-populated "Recently Added" playlist.
~/Music/iTunes/iTunes Media/Music looks almost exactly like the directory structure you're probably meticulously maintaining by hand (lots of Artist Name/Album Name/01 Track Name.m4a), but adding new stuff to it is a simple matter of dropping a directory full of properly-tagged files onto iTunes. Then I delete the original files after iTunes has copied it into its directory.
It breaks down if 90% of your collection is a bunch of badly-tagged files you downloaded off of Kazaa, but if your collection is a mix of stuff you ripped from CDs back in the nineties and made sure were tagged properly, and stuff you've bought that the musician/store tagged properly, it works pretty much seamlessly. Except for this one weird glitch where sometimes iTunes on my Mac decides that I have both the copy of a track I bought off of the Apple store, and one in the cloud, and thus plays every song off an album twice. I've kinda quit buying stuff from Apple because of this; I'll go to Bandcamp first.
What is the attraction of meticulously maintaining a directory structure that a computer can very, very easily maintain based on the metadata stored in the files? Why are you so married to explicitly browsing a duplicate of this filesystem on other player devices?
It is so bad that it makes me wonder who at Apple thought any of it was a good idea. It's not even 'bad from a certain perspective' bad but totally FUBAR.
I simply stopped trying to use my phone for playing music, and it is the #1 reason why my partner wants to move away from iOS.
But then, my metadata's in pretty good shape, so it barely even matters how my music files are stored. One big flat directory, carefully named folders, not that important.
To be fair, I guess I did have to convert the FLAC to m4a. Drag to converter program, convert a ton in one go, drag those into iTunes. So that's one more step.
Or you can use one of the many other apps others linked to if you really want to stick to the whole filesystem thing.
I quickly got away from trying to organize stuff in the file system when I got my Personal Jukebox 100 (PJB100) - literally the first hard-drive-based MP3 player out there, back in the late 90's. It heavily relied on MP3 tags, so I developed tag discipline early on - and never looked back. Tags are WAY more flexible than folder structures. I couldn't care less how files are stored in the file system.
This conversation always amuses me - we don't complain that the computer tracks all the parts of our files in a directory while we have no control over the layout of the files on disk, mainly because that's a level of minutiae better left to automation. For me it's a similar thing with my music files. As long as my tag information is accurate (and since it's the first thing I do when I add something to my collection, it is), I can manage my music collection however I want, irrespective of where the file is.
I will say they have no problem booting up a VM, and everything he is doing can be done in a VM with hardware-accelerated graphics. So full-screened it feels just like the real thing, as long as you are not playing modern games. Boot into Windows for that.
That being said, there are Windows laptops that will give a MacBook Pro a run for its money, but when it comes to the $1k price range the MacBook Air is hard to beat. I like the ARM chips for basic users, but for a power user the ARM chips are a step backwards, unless you want to run ARM-based Linux. With my many years of Raspberry Pi use, ARM has come a long way on the Linux front. But I'm not sure I'd want to rock that as my main.
At the same time you can do all these things on Windows as well... Really more of a personal preference. I have limited experience with Windows Subsystem for Linux 2, but it worked damn well when I used it.
> Honestly never seen a modern Macbook Pro run linux natively without major compromise.
When I was using Linux as my main OS, I was having this problem with all laptops. Desktops usually had far fewer issues, but were occasionally less than perfect.
For example, Lenovo, at the time, was hailed as great for Linux compatibility. Well, for some reason, regardless of the distro I was using, I was more or less given an ultimatum -- I could have a screen with adjustable brightness or a consistent network card driver, but not both. Editing whatever file I found on various forums to "fix" the backlight issue (it was permanently stuck at 100% brightness) would inevitably cause some sort of issue where my Internet connection speed would drop from around 1 Gbps to Mbps and eventually to Kbps the longer the laptop was awake.
It was an issue I never found a solution to, and the oddest part was that it was never an issue if I booted the OS off a live USB. Only when the OS was installed would the issue arise.
Various laptops I owned / used for work had their own issues with Linux too. I eventually just settled on using Linux in virtual machines / servers and never looked back. So, I definitely agree with your point about VMs.
I'd consider going back, but I really cannot/do not want to sacrifice too much time getting distracted with making the OS bend to my will when I could use that time to actually get things done.
Dual booting on any hardware is nuts if you ask me; pick an OS and use VMs. If you really have to have a native OS but hate it so much you won't use it the rest of the time, stick it on a different machine.
>I realised that my life while using Apple products is controlled by Product Managers/Owners who want to get a raise, rather than by technology people who share the same passion as me. And I wanted to change that.
Obviously they are "technology people who share the same passion as me" and are browsing through HN.
Except they actually are? If you read their dev blogs, it's insane how much love actually goes into building the colour magic that goes into the cameras. Truly decades upon decades of a labour of love.
Meanwhile, over at the Apple camp, they deliberately hinder UX for end users if it means Apple gets to make some more profit that way.
I just see two companies, that want to sell stuff for more than it costs them to produce, but what would I know.
While this is purely anecdotal, I've observed a higher proportion of "mission oriented" product and even I.T. leads at Apple than at the other brands he adopted in this divorce. (He's not using a Framework laptop yet.)
You can do all of them when you use a Mac, but Apple will do a lot to try to persuade you not to. That means you'll spend a lot of mental energy fighting with Apple's very effective p̶s̶y̶c̶h̶o̶l̶o̶g̶y̶ marketing department, and often you won't win. That's fine. You don't really lose anything, but every so often you'll think to yourself "I listen to the same bands all the time" or "I wish this phone had a bigger camera sensor." and you'll regret putting so much of your life, and money, in Apple's pockets.
I would have hoped by now that integration with Dropbox, SharePoint/OneDrive, and Google Drive would have gotten better. But they are mostly barebones.
Granted Google Drive is barebones on nearly every platform.
I'm a big 365 user, and that is by far the best experience so far in terms of fluid use between devices and apps. Especially on Windows, where even sharing a file doesn't require a web browser popup. But oh man, the admin side of 365 is overly complex, and it's easy to see why people use GSuite.
iCloud sucks and has always sucked, unless, that is, you are 100% in the Apple ecosystem. Integration into my onsite server is easy with 365, Google, and Dropbox. iCloud is super clunky, and I'm really thinking about moving the wife to something other than iCloud.
Apple Music and Bandcamp are essentially the same thing. I personally use Spotify, and I don't have to listen to "mind numbing" music. You can choose to let the application play music for you, or you can choose your own music.
Spotify 10 years ago was exactly how I discovered old/new bands and artists that were not mainstream.
I don't see this person's logic in that regard at all.
Spotify to my knowledge and experience has an enormous amount of older music. Enough to make it the main source of music for my parents. I think this guy needs to realise that they don't need to listen to the curated playlists...
"Hey Siri, shuffle <playlist name>" - no clicks involved.
As for wired/wireless headphones: I wasn't aware that running Arch was a requirement for using wired headphones, but then again, I moved to wireless headphones a decade or so ago and have changed my habits to allow me to charge them while I sleep, as well as checking the product specifications for any headphones I buy to see if they actually match my expected usage pattern. But then again, I'm weird.
I don't think that is confusing at all; it's just a matter of degree between leaving all of Apple's services or additionally leaving their OS as well.
Which...I guess? Millions (billions) of people use all sorts of stuff. Good for them.
The "I'm leaving [some platform]" posts are always extremely low value pablum for a subset.
- dedicated hardware does a better job than general purpose one (cameras vs. smartphones)
- the smartphone ecosystem (regardless of brand) is becoming more closed, and monitored, every year
- there are enough alternatives out there, even if they mean giving up some "conveniences" we thought we had won over the last few years (IMHO, most of the perceived "inconveniences" that come with these alternatives are due more to lock-in and dark patterns from FANG/MS and phone OEMs)
Somehow I've managed to own an SLR alongside every system I've used for the past decade+ (though it sits unused to a much greater degree given how vastly improved modern smartphones are...). I have a Lenovo Windows laptop, a Windows 10 gaming PC, an Intel MBP, an M1 Mac Mini, an iPhone, an iPad, though I've owned a number of Android tablets and smartphones (including every Nexus device) before. Every server system I've deployed in the past decade has been to Linux.
No Apple stormtroopers ever busted down my doors and demanded compliance. Never did I feel the need to wave a flag or commit to a tribe, because why would I? The notion is self-sabotaging.
In a word "integration".
Can you use an iPad as a second screen/Wacom Cintiq? On Mac it is built in. On Windows you need extra software, and then it is still duct-taped on.
Linux desktop and Android phone? Then you can reply to SMS, share the clipboard, auto-mute music when receiving a call, ... (KDE Connect. MS and Apple have some decent copies.) What if you have an iPhone instead? Well, bad luck.
All these things might be "minor" or "only by default", but they matter very much to many people.
Hardly headline news - a $1,700 camera with >$1,000 lenses is better than a $1,000 phone with a camera attached. If anything, the gap is closer than it has ever been. Irrespective, there is literally nothing stopping someone from having an iPhone and a dedicated camera; or if there is, then I (and practically everyone I know) am breaking some sort of rule, dictated by the overlords at Apple HQ, that thou shalt not use a camera.
The original assertion was same price, so a $1,000 kit body + lens is better than a $1,000 phone and camera.
Which doesn't change anything about how inane a point it is.
I think that depends on what you mean by "better job." For example, I have both a Nikon D-800 and an iPhone. The Nikon takes amazing pictures and the iPhone fits in my pocket. Most of the time, the "better job" I want is a camera that fits in my pocket. On occasion, the "better job" I want is a camera that takes amazing pictures.
I would never abandon either my camera or my phone because my "better job" frequently changes.
Problem is that there are so many different people, and so many of them don't even know how to operate a device that isn't completely pre-prepped for the lowest bar to entry that they can never be as cool. /s
I guess this is the kind of data point that people might use in reinforcing their bias for a choice they are about to make. If you take two brands, platforms, systems etc. and just search for switchers between those you'll find the exact result you'll like.
Going from Debian to Arch? Amazing! Debian sucks for reasons X and Y and Arch is much cooler. But switching from Arch to Debian? Hah, those Arch losers are missing out on A, B and C so they are so uncool! Heck, you can make this even smaller. Using OpenSans as your Font? Boo! Use Fira Code, that makes your code so much better! Pick any microcosmos and you can find migrations in all directions.
But the older I get, the more I find I’m just happier in the Apple ecosystem. I don’t want to fiddle with X11 settings or tons of dot files. I just want to open my computer and hack on the shit that interests me. There’s things about Apple that annoy me, but the same can be said for Android, Windows, any of the free *nixes, etc.
Every platform and ecosystem has trade offs and it’s fine to just use what you want to use. Personally I’ve got Apple laptop, phone and tablet, a Windows desktop for games, several RPis running various Linux distros, and a Proxmox server running VMs for infrastructure and tinkering. None of that makes my farts smell better than anyone else.
… except for those nasty Emacs users of course :P
'I needed windows for some work stuff, so I went onto dell and bought the thing in the middle of the price range.'
Yes you can hit CTRL + left arrow or right arrow multiple times but... once one gets into a workflow that allows them to hit Super+2, it actually feels like a pain to do something else.
You apparently didn't read the article properly. The author's major grouse is how he slowly recognized that Apple is very controlling about the user experience on its devices, and how this is a huge limitation on doing anything "outside" of Apple's "thinking" of how software or hardware should be used. And how Apple's product managers have lost sight of what really adds value to the user experience, with the software and hardware choices of new Apple devices now dictated more by their own greed/ambitions.
From the very examples you cited, the author's emphasis thus was on:
> ... but I am back in control. I am more productive.
> Wired headphones never run out of battery and have superior audio quality ...
> I can take real photos with high quality ...
> I have fun again discovering bands and artists on Bandcamp instead of mind numbing listening to Apple Music playlists
> ... but also my OS is way more productive and faster
So it's not just about what you can or cannot do on a Mac, but how the author has found a better way to do all this outside of Apple's limiting ecosystem, keeping in line with his new belief that Apple no longer cares about users like him. And I fully agree with him and share the exact feeling (I feel Apple says a "F*k you" to me every time I want to maximise a Finder or Safari window, because the "Apple way" is that you are only supposed to make them fullscreen or vertically maximise ...)
There's FL Studio (or Bitwig, or Reaper) to match Apple's Logic, 1Password (or Bitwarden, from what I hear) to match Google's passwords.google.com, and Backblaze + others to match Time Machine.
Sure, 'switching to Apple' is a choice, but only one of many. Personally, I am tired of using tech designed for and by managers and SWEs gunning for promotion rather than for personal empowerment or passion for personal computing.
Besides, everyone needs to ditch Chrome. Firefox could use the love
BB complements TM for me; it's super useful to have a local versioned backup that's updated every time I dock my laptop with the desk, and is also invisibly supplemented by a versioned backup of changes I've made that the OS maintains on unused space on the laptop's drive. Works even if I'm sitting at a cafe with all the radios turned off to get maximum life out of the battery.
Backblaze? Backblaze is there in case my house burns down and I leave with nothing but the clothes on my back.
For everyone else, sure, there might be 100 people in the world that will actually audit their open source password managers. But that isn't exactly moving the rest of the industry forward (be it from the engineering perspective or the user perspective).
In this case, (almost) perfect is the enemy of good.
Unless you have spoken to all managers and SWEs, I doubt you can make such a blanket statement about the immense workforces at Microsoft, Google, and Apple.
Sure, one could argue that Microsoft is profit-driven, Google might be marketshare-driven and Apple might be UX-driven, and all of those could be true (or at least true for a major subset of both the engineers and the targeted user group). But it would be a guess at best and not helpful to your personal situation or anyone else's.
Personal computing is a box under your desk or in your pocket beefy and smart enough to solve your problems without checking in with daddy warbucks and the mothership.
Nearly everything about modern cloud computing is the opposite... it disempowers individuals and creates an unhealthy relationship with things that are out of the user's control. How many people's livelihoods have been messed up because FANG decided to lock them out for some reason or another? How many articles have been posted to HN and elsewhere begging for a FANG employee to come along and fix their problem because there is no other option?
The server in my closet or at my datacenter isn't going to lock me out because I pissed off Google.
In practice, I am continually having to disable cloud-related features (worded in the most patronizing "you need this" way possible) every time Windows updates.
In practice, common knowledge on desktop computing is atrophying because of an increasingly acute lack of knowledgeable users on the internet. As more and more of the young users who grew up in this gilded cage enter a level of expertise, their advice pollutes forums with cloud-backed solutions and knowledge of 'the old ways' becomes harder and harder to find.
Because it works for people - such as my older relatives - who wouldn't use a desktop and would have problems setting up and configuring one, especially for an "advanced" use case such as setting up your own private cloud.
As a computer instructor back home in West Africa, before smartphones I had students (especially older ones) who struggled with even using a mouse or keyboard. I've never had to teach a single one of them how to use their smartphone (at least for things like browsing the internet, taking and editing photos, making notes, etc.). Providing things like photo editing on-device matters, because having to download and figure out a program like Photoshop (or ImageMagick, for example) is a non-starter for them. Now they can edit, filter, share, and make home movies - all without needing any help.
>> Where is the passion for 'personal computing' when nearly everything personal about it is delegated to the cloud?
I don't know what you mean by passion, but these people love their phones and the things they let them do, with an intuitive UI (as well as a transferable one between Android and iOS, since they're about the same feature-wise at the moment).
>> Personal computing is a box under your desk or in your pocket beefy and smart enough to solve your problems
This is ONE version of personal computing, and tbh an option only available to us "hackers".
>> without checking in with daddy warbucks and the mothership.
You don't need to check in. My country has crappy bandwidth and of course it's impossible to do things like pay for iCloud or gdrive without credit cards, so most use the phone without backups, and they're fine.
>> Nearly everything about modern cloud computing is the opposite... it disempowers individuals and creates an unhealthy relationship with things that are out of the users control.
How is allowing my students, older relatives, etc - who wouldn't be able to use a laptop or desktop machine - to go online and talk to their children abroad using apps like whatsapp - disempowering? I understand your point - I miss the days when we had computers to mess around with and learn Linux and coding on. Most of us on here came up that way, compared to kids now who don't have to see the underlying OS or tinker with it - they can just play Fortnite and watch YouTube.
But most people aren't interested in these things, or even care about machine models, OS versions etc. They buy phones because they want to communicate online and talk to their friends and family. That's it.
Before whatsapp the only way I could talk to my family was through international calls, which are prohibitively expensive, so they would happen about once a month. Once they got on whatsapp though we moved to being able to talk, share videos and pics of things happening at home, every single day. In addition a large part of getting rid of our dictator was people being able to use apps like whatsapp to share the latest information, even as he cracked down and only allowed propaganda to be published even in private newspapers.
This is a net positive however you look at it - it has brought far more people (especially in poorer countries) into the digital age. I find that very empowering in the sense of putting tools in the hands of more people, including barely literate ones (my country has something like a 40% illiteracy rate) who use things like whatsapp voice notes to communicate.
>> How many peoples' livelihoods have been messed up because FANG decided to lock them out for some reason or another?
This is true but not really related to your larger point about computing devices and how they are being used now.
>> The server in my closet or at my datacenter isn't going to lock me out because I pissed off Google.
This really doesn't happen that often considering we're talking billions of users.
We are on a "hacker" site where most people are skilled "tech people" so of course when things break for them they write blog posts and comments and we get to see and argue over those. Most people using these devices don't know or care about any of this, and have gmail accounts going back a decade that they've used without any issues.
In summary your version of computing (which is similar to mine) just isn't the universal version. I always find joy whenever I go back home and find taxi drivers using their phones to play music, people using YouTube to check instructional videos and post their own, etc. When my grandmother died my uncle gathered all the photos, videos, etc that we have of her as a family on whatsapp, and made a nice video tribute of her life to share with us. Gathering the photos and videos took longer than making the movie itself, which was a few taps, and it was a very powerful moment emotionally, especially for us family members outside the country. We were able to participate in remembering her life together in a way that international phone calls and a complex (to my uncle) photo editing app on a PC under a desk never would have let us.
I find this very empowering, and it makes me happy how far we've come and how many more people we have brought across the digital divide. Let's not be myopic because those of us on here have the technical knowledge to disdain and even dump these platforms.
And keep in mind that American / Western users' use cases are very different from the farmer who can only afford a cheap android phone, being able to come online finally to check on crop prices so they won't get cheated, send pictures of their crops to prospective buyers, and even do things like checking the weather.
It's the tyranny of the "minimum viable user". By making information systems that are safe enough for your students or their grandparents to use, the companies have to take away the customizability, configurability and assorted "sharp edges" that make those systems useful for more advanced users. I say "have to" deliberately, because the counterargument of, "Why can't they make both," never seems to describe a real world system.
Whether it's Windows, MacOS or heck even Gnome3, the more an information system attempts to cater to the needs of novice users, the worse it becomes at catering to the needs of advanced users. And the terrifying reality is that inexperienced users outnumber us hackers by two orders of magnitude (or more).
Some points to back up my contention:
- Microsoft does not allow the Windows serf to uninstall Edge on Windows 10.
- Microsoft is hostile to the idea of allowing a Windows serf to self-sign their own TPM.
- Microsoft does not allow the Windows serf to inspect the code that arbitrates their computing.
- Microsoft does not offer the Windows serf any ability to inspect, reject, or roll-back "updates."
Understanding these harms helps my personal situation by enabling me to make informed decisions about avoiding serfdom to Microsoft and their legion of cog-like engineers, which is definitely to my benefit. Microsoft engineers actualize the harmful policy of "Windows users deserve less;" therefore, the sniveling SWEs at Microsoft deserve less.
Part of that less which I shall forever withhold is my money and my endorsement of their basic capacity for ethical computing. Microsoft engineers are simply not respectable with regards to any ideals of user empowerment they may pretend to hold.
Being able to buy a new Mac and make it into an exact copy of the old Mac by using time machine is amazing - I’ve done it a few times over the years, and it still impresses me every time. Plus I have time machine backups from 2011 which I occasionally browse to see what my kids were up to then.
You are a different target market - always check for silly normie settings.
Apparently Dropbox notices they already have that file, and instead of you uploading it they just make it appear in your account.
ISO files are included in that filter list, among other files like VMs and such. Seems like a massive exclusion net to cast when claiming everything is backed up.
Removing these exclusions is very easy and obvious.
It should be used in concert with an offsite backup service like Backblaze.
Can you link to a step by step on this - I think this would help a lot of folks.
Check out his YouTube videos as well, for a more visual walk-through.
File History covers all your files, and it is amazing at it. You can roll back to a particular version from a particular date.
There are plenty of arguments against Logic, but these three are not in the same league.
For other workflows, Reaper and Logic can do things that are essentially impossible in Live/Bitwig/FL Studio.
There's no best DAW for everyone, only best-DAW-for-the-workflow-you're-using-today. Sometimes that's Logic, sometimes it's not.
Not in Live: Logic's Environment (you could do this in an M4L object, but not in Live itself).
Windows/Android can be considered a natural pair - Microsoft seems to think so. Personally I use the Linux/Android pair, and I've been vaguely considering switching to an iPhone to avoid Google.
One of the sad ironies of my life is that once I switched to macOS and could’ve avoided all the annoyances and instabilities I’ve had with audio on Windows, I just never felt like making music again. Life is like that sometimes. :)
Suffice to say I think all of the software in your list is pretty great and I chose what I chose for reasons that are wholly mine!
It feels like everything is overflowing with mediocrity, and there's no hope of it ever changing. I often have an idea for improving something that I see as being extremely shitty and perfect for improving/innovating, but there is no possible way for me to turn it into a successful product/business because the monopolists wouldn't allow it.
For example, why is the app store experience that same boring crap it has always been? Why can't that be fun, with like user profiles, Amazon-style reviews, Steam-style community features, social media-style feeds so you can sub to a developer, TikTok-style feeds of promo videos to discover new apps/games, etc. There's so much to explore there, but it's never going to happen because Apple doesn't allow it, and Apple themselves isn't going to do it because they're a monopoly and they have no incentive to (same goes for Google).
What are the chances that our government will wake up and do something about this limp noodle we call a tech industry? Or that some startup will "disrupt" these monopolists with something better and resist the multi-billion-dollar acquisition offers (backed by anti-competitive threats)?
This makes me want to grab my old thinkpad and disappear into the woods. I'll become a hermit, using only my own+libre software and only returning to civilization when I need to replace the battery pack.
Similar to the other comment, I read that with sheer horror. I specifically don't want to spend a lot of time in any app store. I want to find quality apps I'm looking for, download them and leave.
I’d definitely think that about the Apple Store if they introduced some of the features that you suggested — that’s part of the problem. What one person considers an obvious innovation another person considers yet another newfangled UI design.
Apple Store already has star ratings, user reviews, and links to publisher web sites. I don't use any of the social/community features of Steam, and I can't stand the amount of review-stacking that happens in Steam and Amazon.
The one area that Steam is useful is their discovery stream where Steam will show you more games that they think you might like — the catch being that if you like one RTS game, Steam will show you lots of RTS games and never show you Mass Effect, Torchlight, KSP, or Dad Dating Sim.
As to why Apple doesn't have a discovery stream, just look at Steam's problematic system and you'll understand: Apple doesn't have it because they haven't figured out how to do it in a way that makes sense and will help people discover apps that they'll enjoy.
Where the Apple Store suffers is in discoverability: in some cases I've been looking for a specific app and the thing I'm looking for ends up "below the fold" because a dozen other apps have paid to get higher rankings for that search term.
I do not want TikTok-style feeds of promotional material clogging up my App Store. I want to get in there, find the thing I want, then get out. I do not spend my life looking for inspiration from apps in the store.
If you need a low cost vendor neutral DAW, Reaper is the way to go. I'm a semi-professional musician, and well compensated software architecture consultant, so cost is not a factor in my software purchases for music, but when I want a linear-paradigm DAW, it's Reaper. That product is dope and the company is awesome.
iOS and MacOS both have endless UI changes as well.
For what it's worth: if anybody else is facing the same conundrum, choose an OS-agnostic DAW like Reaper, Renoise or Bitwig, then it doesn't matter what OS you're on.
Running Affinity Designer under a Windows 10 VM is a non-issue.
An OS-agnostic path is the only long-term way of working.
Now under Linux I use Resolve, Bitwig and Reaper, Blender, Inkscape, Krita and Emacs, and I feel confident that no corporation will dictate my workflow in its endless pursuit of world domination.
The reality is that we are living in some form of economic stagnation; worldwide changes are ahead and people will be poorer, which will push Apple's vision of $1000+ smartphones and overpriced computers out of place. I have converted my business to Arch and in the process saved a ton of money (previously earmarked for Apple Silicon).
Now this money will be spent on my engineers, not on some half-assed marketing plan from a company out of touch.
The only software that kept me inside the Apple ecosystem was Sketch. Since the pandemic started, for good or bad, Figma has been taking Sketch's market share. I don't like the SaaS-for-design paradigm, but we adapt to clients' requirements, not to our own taste. And I used Inkscape for interface design long before, back in the Macromedia Fireworks days, to offer a specific solution. So I am set for the future. :)
> I realised that my life while using Apple products is
> controlled by Product Managers/Owners who want to get a
> raise, rather than by technology people who share the same
> passion as me
Mainly because I feel exactly the opposite. I don't find Windows, Linux or x86 technologically exciting anymore. Apple makes (IMO of course) the most technologically exciting CPUs, their GPUs are a breath of fresh air, and I love their approach to UI, APIs and OS security.
Instead some things that have every day practical relevance for me:
- a read-only mounted system volume that prevents me from making stupid mistakes and accidentally sudo rm-ing important files
- application certificates
- all executables being code-signed to prevent tampering
- built in zero-knowledge password manager with automatic synchronization across all my devices
- full hardware isolation for DMA devices on Apple Silicon
What are you talking about? You have nothing to hide?
You are so addicted to Apple's "solution" that, as a tech-savvy professional, you cannot take countermeasures against it.
Practical? What is practical in knowing that your trusted computing device is an easy target? Did you not understand that "power of association" with a high-profile target via a false positive could ruin your life forever? And these "automated" processes will be "included" in all commercially viable OSes.
IDK, but sticking with FOSS and practical knowledge looks like a better solution than trusting whatever magical <Big Corporation> tech.
But who am I to question the Apple fetishists of current day. After all in the past I was in this camp. So be happy with whatever "rationalization" you come up with. Everything is fine and dandy, the Big Apple is taking care of you.
Because of the availability of a control surface, absent in Windows and macOS, and most importantly kernel access. Yep, there are security problems in any OS, but to compare a custom-built Gentoo with your beloved Apple toy - please grow up :) Your argument is funny and nonfactual.
To trust that a company which publicly cooperates with oppressive governments and creates on-device scanning software, breaching all "privacy" promises, is helping with your security is absurd.
I have used exclusively Apple computers since early 2000s, and can pinpoint the moment in which all that I loved ended.
The moment when the iPhone was born.
Since then fighting with Apple telemetry was "business as usual", the existence of Little Snitch is all the proof that you need.
Nobody said Apple's security is perfect. People's views are simply biased because when Apple's much tighter security is breached, of course it will make headlines everywhere. Windows and Android have far more malware, but you won't see headlines about it every single day.
Either way, unless you're a journalist, politician or some other high-value target, you're pretty unlikely to be targeted by exploits like the NSO one. But if you don't care about jailbreaking, you can still update your device to give yourself peace of mind.
That has changed massively recently, my sense is rough parity currently (on a clean android phone).
Payouts are still a lot higher per point marketshare for iOS though, so maybe still harder to develop.
Android is about $30K per point market share and iOS is around $130K per point market share (4x more).
For example, it's not that Exchange / Outlook are HARDER to get an RCE on that makes the payout so high for those, it's because they have a LOT more usage than something like postfix etc.
I hate how closed-off a lot of their ecosystem can be - and cue the obligatory "I prefer open source" that every engineer says - but I do honestly (and sadly) think that macOS is the least-bad desktop OS out there right now.
But any time you are emulating one complex low level API with another, there are likely performance penalties. Not to mention you are raising the complexity of the software stack and adding more surface area for bugs.
It would be better if there was an open source low level graphics API that every OS could support in addition to their own proprietary APIs focused on ease of use.
Funny enough, that would be Vulkan, right?
Note that WebGPU has been heavily inspired by Metal (AFAIK), and it uses translation layers on top of a number of APIs (whatever is available on the system). And it's not limited to browsers, despite what its name might suggest.
So it seems like WebGPU is your best bet if you want to try something Metal-like that works across platforms.
I'm not a graphics programmer, but whenever I've gotten the itch, I've always gotten huge headaches trying to get Vulkan to work, and had basically no issues at all using Metal.
- TBDR with user-programmable persistent GPU caches allow you to do some really cool things smartly, drastically cutting down the amount of work and memory fetched
- GPUs are trivially exposed as what they are: machines with very wide SIMD ALUs
- resource binding graphs with full support for pointers and indirection; resource bindings that can be created and populated on the GPU
- sparse resources that actually work and are performant (nobody uses them on mainstream GPUs because they are apparently slow as f** there)
Pretty sure it was before the millennium when computers became cheap while the value of repair skills increased, it was no longer cost effective then to have the old malfunctioning computer of any make repaired. I expect you mean that you want to repair it yourself, but if you are skilled enough to do that, your time is valuable, so it hasn't been worth your time for at least two decades to spend even 10 hours a year repairing a machine more than about 3 years old. And AppleCare is worth the cost of extended warranty, so you don't have to repair it.
> Apple Music can't even play FLAC files
That's just that software. VLC runs on Apple Silicon, so you don't need to use Apple Music, but if you did and want to play FLAC files, you'd need to transcode them to ALAC, Apple Lossless (or mp3 or aac for lossy). You could build and script ffmpeg to transcode all your FLAC files to ALAC in one CLI command, because lossless is lossless. I'm kind of curious just how fast Apple Silicon will do that... seconds I bet, because it takes only a couple minutes for my Core2Duo to transcode a full-length album of FLAC to ALAC.
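For what it's worth, the "one CLI command" version is pretty short. A sketch, assuming ffmpeg is installed and your library lives under the current directory (adjust paths to taste):

```shell
# Transcode every FLAC under the current directory to ALAC (.m4a).
# Lossless-to-lossless, so no quality is lost; ffmpeg carries the tags over.
# -n refuses to overwrite existing files, so re-running is safe.
find . -type f -name '*.flac' -print0 |
  while IFS= read -r -d '' f; do
    ffmpeg -n -loglevel error -i "$f" -vn -c:a alac "${f%.flac}.m4a"
  done
```

(`-vn` drops any embedded cover art, which can otherwise trip up the MP4 muxer; remove that flag if you want ffmpeg to try carrying the artwork over.)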
The median hourly income in the United States is $20/hr. Buying a new computer every three years is certainly going to cost you more than $200 a year, unless you buy extremely cheaply. And that's leaving out the massive environmental externalities of throwing away older equipment rather than repairing it. It also leaves out the fact that most people have free time that they cannot trivially trade for more working hours.
Furthermore, 10 hours a year seems ludicrous. Apple bricked my MBP with the Big Sur update (along with many other people's laptops, it was a bit of a scandal). I repaired the rather infamously not-repair-friendly Apple device in about half an hour with a part I bought on Ebay for less than $10.
> I expect you mean that you want to repair it yourself, but if you are skilled enough to do that, your time is valuable, so it hasn't been worth your time for at least two decades to spend even 10 hours a year repairing a machine more than about 3 years old.
So your claim about the cost of onsite repair is completely irrelevant. We're talking about the cost of repairing something yourself, which you very frequently can do, even when you're working with something as unrepairable as a Macbook.
I replaced the battery in my X1 Carbon using a screwdriver in about 15 minutes. No significant downtime or hassle.
> You could build and script ffmpeg to transcode all your FLAC files to ALAC
Hardly "Just Works" though is it? I might as well stay on Linux...
If you run linux, sooner or later you're going to want to compile software from source. Building software on linux is hardly any different from building software on any other *NIX platform. Developers and package managers have made the process easy. So whatever your pleasure, please indulge.
Which would suggest that Apple's glued-in batteries are a bad idea? Whatever one chooses to call the process of changing a battery, this is unnecessarily difficult with Apple hardware.
In fact, I feel like my freedom is more practically restricted on Linux, with its general hostility against distributing compiled code.
If this thread shows the reality of the level of critical thinking among tech-educated people, we are living in 1984 in full swing.
Attacking the author for sharing personal decisions backed by factual data, and dismissing his arguments with personal "ease of use" points. Just totally out of touch with reality.
And when I think that most of these people are producing the software solutions of today - OMG.
I already cannot fathom how we dealt with the URL bar at the top of these big phones for so long. It's been a long time since a major UI change in some popular software actually made me happy, let alone this ecstatic. I'm usually one to complain about unnecessary change (like Firefox's recent UI refresh).
It's not at the bottom, but within the content viewport. Browsers need a dedicated "safe space" for notifications and symbols like a lock for HTTPS. A floating address bar over the content of the website could easily be faked using HTML/CSS/JS and abused for phishing.
Sure, put it at the bottom, but not as a floating pill over the content.
'Next/Apple' isn't a quick playbook - it's an over 30 year R&D effort to create a hugely complex software and hardware business, and it spent about $100 billion in R&D to get its products where they are today (at the absolute cutting edge of technology). Writing your own modern OS and building/manufacturing good hardware to compete with this is difficult enough, and then you have the even bigger challenge of getting all the major software vendors to support your new platform.
But isn't this much, much easier if you just piggyback on the Apple hardware ?
I always expected this to happen.
Circa 2008 or 2009 I thought that any day now there would be a linux distribution built specifically for one single Apple laptop. No hardware issues, no gremlins, no moving targets - you would have a (very) fixed hardware target and optimize just for that. Then I, as a user, could just go to the Apple store and buy a nice shiny device and install MBAlinux on it and call it a day.
I really don't understand why this never happened. Further, in many ways it seems that the opposite of this happened - installing linux/FreeBSD is weirdly painful on Apple laptops which is unexpected since we all know what is inside of them and the installed base is huge.
So I would suggest that you could, indeed, build a hardware/software ecosystem - just let Apple build the hardware part ...
Let's take Ubuntu as an example. Today you can get Ubuntu laptops that will work out of the box. Is that true tomorrow? Absolutely not. The next distro version will break something in the hardware. I have been burned by this twice now. At the end of the day the Apple premium is not really a premium. It ensures that they continue to support their legacy hardware for years. The people who bash the premium as some sort of "idiot tax" are actually valuing the software that runs on the machine at $0. There are too many people in this world who don't understand how much effort it takes to create and maintain good reliable software. You see it on the app store where people can't fathom spending 99 cents, and you see it in the bashing of Apple devices.
Let's assume that your hardware works beautifully with the current version. Then you actually look at the apps shipped with the distro. They are poorly made and do not form a cohesive OS. You are forced to hunt for other open source equivalents to basic stuff like "paint". Have you tried using the calculator or notepad equivalents? They suck compared to the simple and easy to use Windows and Mac equivalents. This is something even Windows gets right. It comes from the fact that Canonical does not have the resources to build each app around a unified design and UX principle, so they farm it out to the "open source community".
Finally, why does each distro version seem to break something on the same hardware year after year? There seems to be a serious lack of regression testing on these distros. For 10+ years I have witnessed how one version of Ubuntu breaks some stuff and fixes others, and then the next version fixes some stuff but breaks previously working items. Then it gets worse: the subsequent version breaks previously fixed stuff again! I am forced to QA the entire OS every time a new release comes out and hope I don't miss something (which I always do)!
I did try and use Linux full time but the UI drove me up the wall so I’m back comfortable in macOS but my dev work flies on that Linux box.
Isn't it a bit on the nose that you accuse Linux of failing at the one thing that Windows is notoriously bad at, UX cohesion?
More seriously -- Linux reflects a different mentality and way of doing things. It is not for everyone. Downloading the software you want is the expected way to do things. I have no idea whether Ubuntu ships a "paint" replacement, but regardless, Pinta or Paint.NET are like three clicks away, thanks to the software repository approach.
Linux is more or less for people who want to experiment and configure things their own way, and make software that solves their own problems in the way that they want those problems solved. Creating a single, opinionated, out-of-the-box working desktop experience with perfect hardware compatibility with whatever bullshit proprietary-blob using silicon is out there is (a) hard, and (b) not what most Linux-using developers are interested in.
The people who use Linux largely recognize that yes, it is a compromise, but also that using Windows or macOS also represents a compromise. Having used Linux for nearly 15 years now myself, I can say confidently that the trade-offs for me weigh heavily in favor of Linux.
Yeah, you can criticize Windows for trying to update their designs with Metro and the like, but in reality all the old apps that worked cohesively are still there even today. Ubuntu and the Gnome- or KDE-based distros never had this to begin with. Just multiple flavors of the same cruddy base applications, since all the distros are using the same apps anyway.
>More seriously -- Linux reflects a different mentality and way of doing things. It is not for everyone. Downloading the software you want is the expected way to do things. I have no idea whether Ubuntu ships a "paint" replacement, but regardless, Pinta or Paint.NET are like three clicks away, thanks to the software repository approach.
Yeah, that's fine, but that unfortunately makes it a non-starter if you are looking for a direct replacement for macOS or Windows.
Yes, Paint.NET/Pinta/GIMP are always trotted out when I post this example. Pinta has been an unstable mess every time I have installed it. Plus "paint" is a near-instant loading app that is several MB in size, whereas Pinta installs loads of supporting libraries because it is a more complex application. You're telling me that in 2021 they can't just ship a simple app that lets a user jump in to resize images or add some text to basic images? This hinders the usability of the system when I can't just quickly do a simple task and move on! It is as if the developers of these distros have never understood how a regular user uses a PC.
>The people who use Linux largely recognize that yes, it is a compromise, but also that using Windows or macOS also represents a compromise. Having used Linux for nearly 15 years now myself, I can say confidently that the trade-offs for me weigh heavily in favor of Linux.
The only thing that is a given is that any comment bashing Linux will ultimately attract someone like you who tries to twist and turn my words to justify it. I've seen it for 10+ years now without fail, so I'll leave it at that.
It's a shame because I have looked at the messy bug tracker for Ubuntu and have tried to fix issues, but then I stop and realize: what is the point when it breaks again in some subsequent version of the distro? I wish someone would just dump a bunch of money, hire former Windows/Mac devs and properly build a lot of the supporting components of some distro; then all the other distros could roll up those better apps, and then we would have at least something that can be called adequate in 2021.
The only thing that's a given is that any post that proclaims the relative merits of Linux versus alternative operating systems will immediately attract posts like yours bashing it, so shrug.
> I wish someone would just dump a bunch of money, hire former Windows/Mac devs and properly build a lot of the supporting components of some distro, then all the other distros can roll up those better apps and then we have at least something that can be called adequate in 2021
My point is not that you're wrong to feel this way, but rather that you should recognize that "adequate" is ultimately subjective. Adequate for whom? Adequate how?
The Linux ecosystem is largely designed by and for people who are willing to tinker, willing to customize, who want to design software that scratches their own itches, and who aren't looking for a perfect out-of-the-box experience from a distro. A handful of people want to bring about "the year of Linux on the desktop", but they're a minority and even for them the interest is usually secondary to their own use of Linux.
There's no "twisting your words" required here. What you want is a near-perfect out of the box Linux experience. What most Linux users want is ... something else. My point is simply that that's okay. Linux doesn't have to be for everyone. Your problems with it are not everyone's problems with it. In particular,
> that unfortunately makes it a non-starter if you are looking for a direct replacement for macOS or Windows.
Most Linux users don't want a direct replacement for macOS or Windows. Maybe there's a class of "theoretical Linux switchers" out there who would switch and would be the majority of Linux users if they did, but they are not, at present, the majority of the people using and working on Linux.
What Linux provides me with is (a) a well-integrated package manager containing fully free/libre software, (b) a comprehensible system (where I can understand fully how each part works), and (c) a modifiable system (where I can change how the system operates to the extent I want). Having a perfect replacement of the MS Paint application is not even on my radar. But that said:
> Plus "paint" is a near-instant loading app that is several MB in size whereas Pinta is installing loads of supporting libraries because it is a more complex application.
Maybe you're exaggerating, but on my system Pinta has an installed size of only 2.88 MiB and has only two direct dependencies. Maybe you're thinking of the fact that it's written in Mono (the C# runtime), but that's a shared installation with all other Mono applications. It's equivalent to Windows shipping with the .NET runtime or UWP.
Sure, it's easier, but then I'm not sure what the point is or what makes it one of the biggest opportunities of our time.
I also understand why it never happened - there is already a unix-based OS which is designed with perfect compatibility with the Apple Hardware called OSX! I'm not sure what the advantage to a consumer would be for replacing OSX with linux - other than the fact that it gives consumers choice - but of course providing a distro that only operates on a specific Mac is then limiting hardware choice so it doesn't really solve that in some respects.
And if it's just for developers, then wouldn't developers want some choice of hardware, good support for tooling, the ability to test native apps without virtualisation, etc.?
I suspect the Venn diagram of developers who:
* want a Mac but don't want OSX
* don't mind that they can't upgrade their hardware
* are willing to run some totally-new operating system
* accept that it will initially lack support for the runtimes they use, and some software, and won't be able to develop certain types of software because of this
* accept that if they wish to continue using the OS for their next laptop, they will be fully locked in to a single hardware model
is vanishingly small.
Apple doesn’t prioritize avoiding binary blobs. The EFI firmware is all proprietary. All their Wi-Fi chips have been switched to Broadcom.
They do weird non-standard things to the Thunderbolt controller, e.g., you have to lie to the firmware and claim to be macOS in order for it not to disable the Thunderbolt controller.
Newer MacBooks hide a bunch of hardware behind the proprietary T2, and whatever embedded OS runs the Touch Bar.
MacBooks are not ideologically pure, and effort sunk into getting an OS working on other machines is often wasted on MacBooks because Apple does things in bizarrely different ways.
I only ever need a laptop when traveling, I have a big desktop setup at home. I plan to take my Steam Deck traveling with a portable monitor and keyboard.
Would that even be legal? I mean, selling a commercial OS marketed to be installed as a replacement OS on the most locked-down, most proprietary hardware on the market?
The problem with this is which developers? People who write embedded systems? Web developers? People who write custom Windows applications?
Any given developer subset is likely to find this hypothetical new developer computer to be either too complex to use or not differentiated enough from Windows or MacOs (or ChromeOS).
> Most of the software we use to create software are electron apps
This is not true for most people whose primary employment is writing software, or working on software teams. Most people who get paid to write code work primarily in either the Java or .NET ecosystems and use something like Eclipse, IntelliJ, or Visual Studio. (Many more are using niche-specific tools in a captive platform like Oracle, SalesForce, SAP, etc.) If the new platform doesn't have 100.0% binary compatibility with legacy tools written for Windows and/or MacOS, its addressable market shrinks substantially.
It's critical to have a great developer story yes, but to make a stellar platform that needs to be balanced with a great user story, and that means developers might not always get everything they want down to the letter.
VSCode is not the entire stack needed to build Windows desktop applications. There are tens or hundreds of thousands of developers who build applications for the Windows desktop. I'm not (for the most part) a .NET dev, but my current understanding is that only Visual Studio running on Windows is a first-class citizen with the ability to access all parts of the dev stack. The Windows dev stack doesn't need to move away from native Windows applications any more than does Xcode need to move away from MacOS.
Why would that be an opportunity though?
It would be a low margin niche, with a small market segment, of which most would stick with Apple/Lenovo/Dell.
If you wanted to create a new platform, your best option would be to go the other way and make an OS that was “just electron” and ran all the electron apps in the world faster and better than anything else. Unsurprisingly Google has tried this with Chromebooks, but their track record on consumer product development is so poor that perhaps they just didn’t execute well and someone else could pull it off.
Another challenge is that if you did that you gain wide software compatibility but you lose any obvious differentiator. The likely way to win would be if you could make a laptop that was “just as good at running web apps as your Mac, with just as much battery life and just as nice hardware” but somehow cost under $400 or so.
I actually wouldn’t be surprised if we see that coming out of Chinese OEMs in the next decade.
Screens are the most expensive component of a laptop, phone, or tablet, so you are not going to get a good screen at that price.
If they need a photo manager, in my experience the most common application need after a web browser, then Digikam really cannot be beat.
I am a full-time Linux user, and I'll probably support anyone who wants to try it until the day I die. I absolutely love it. But we still can't enjoy some of the simplest use cases without screwing around with configs and, in some cases, writing scripts that listen to DBus or udev. So every time I hear someone say "just use Linux" I think... nah, just buy a Chromebook (and, yeah, use Linux). If your needs are any more than that, Linux might not be for you.
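For anyone wondering what that glue looks like: a hypothetical udev rule like the one below runs a custom script whenever a USB sound device is plugged in. The rule path and script name are made up for illustration.

```
# /etc/udev/rules.d/99-usb-audio.rules  (hypothetical example)
# When a USB sound device is added, run a user-provided script
# (e.g. to switch the default audio output).
ACTION=="add", SUBSYSTEM=="sound", SUBSYSTEMS=="usb", \
    RUN+="/usr/local/bin/on-headset-plugged.sh"
```

None of this is hard once you know udev exists, but it is exactly the kind of thing a Chromebook user never has to learn.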
> writing scripts that listen to DBus or udev
Exactly. Not to mention the preceding step of spending 30 minutes in forums to find someone else who has had this problem on a system with exactly the same motherboard so you don't try the things that didn't work for them.
I'm still 100% Linux on the server, headless Linux doesn't have nearly the warts as GUI Linux.
> Want to use bluetooth? It might work.
Sounds like the problem of installing Linux on hardware designed for Windows. All those things work flawlessly on my Purism Librem 15, which came with Linux preinstalled. (OK, I did not try Bluetooth, but I've seen reports that it works.)
In the context of the original comment, though, this does narrow down the market that can be addressed by a company running the NeXT/Apple playbook.
It seems vastly cheaper to solve these issues than to start a new company to produce its own hardware and OS.
If you did start such a company and prove there was a market, then you're making a bet that Canonical (or KDE devs) won't put you out of business.
Sadly, this is now true for Macs as well. Where by "not work" I mean: not finding/not supporting the proper resolution and/or refresh rate for the display.
I never succeeded. I just use the Macbook with only the built-in display now.
Here is one. Randomly, based on no relevant input from me or changes in the laptop's state, my network connection dropped and the Network Manager UI was telling me no network adaptor could be detected.
After some fumbling around in the terminal (various reboots did not solve the issue), I managed to enable the wireless adaptor, which apparently could be detected, and connect to my network, even though the UI was telling me in no uncertain terms that no wireless adaptor was connected to my laptop.
Then later, again randomly, based on no relevant input from me or changes in the laptop's state, the UI agrees there is a wireless adaptor connected after all. This is on a machine currently in near-factory state with certified compatible Ubuntu preinstalled.
I share this example because one can at least comprehend why random monitors or graphics cards or what not do not cooperate without fiddling, can comprehend certain apps failing and crashing, can comprehend other unusual bugs. The UI thinking and acting like there is no network card, for no reason whatsoever, is on the other hand completely inexplicable even to competent users.
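For what it's worth, the "fumbling in the terminal" in cases like this usually comes down to checking whether the radio got soft-blocked and unblocking it. A minimal sketch (assuming rfkill and NetworkManager's nmcli are installed; the sample rfkill output below is hypothetical):

```shell
# A tiny helper that reads `rfkill list`-style output on stdin and reports
# whether the Wi-Fi radio is soft-blocked.
is_wifi_soft_blocked() {
    grep -A2 -i 'wireless' | grep -qi 'Soft blocked: yes'
}

# On a real machine you would run something like:
#   rfkill list | is_wifi_soft_blocked && rfkill unblock wifi
#   nmcli radio wifi on        # re-enable Wi-Fi in NetworkManager
#   nmcli device status        # confirm the adaptor is detected again

# Demonstration with hypothetical rfkill output:
printf '0: phy0: Wireless LAN\nSoft blocked: yes\nHard blocked: no\n' \
    | is_wifi_soft_blocked && echo "radio is soft-blocked"
```

Of course, none of this explains why the adaptor vanished in the first place, which is the real complaint.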
Someone needs to just commercialise a proprietary and, at least initially, closed version of Linux (so as to turn a profit) with good design principles in mind, and deal with lawsuits and license issues later. There is plenty of money in it.
As for the barrier, it is in the software and app ecosystem.
My point with the software is that most of the software people use to create software/apps is increasingly built with web technologies (electron apps), so the native app ecosystem on a desktop machine is increasingly weak as a moat.
- Unix-like, developing on Windows is pure torture
- Don't waste my time with configurations, drivers and other crap, I want a machine that I can be productive with out of the box. Take my money if you have to, but I don't want to edit Xorg.conf ever again.
You can talk about polish all day, but no machine that doesn't satisfy both is even close to appealing for a majority of developers in my experience.
You are right about the native app ecosystem being less of a blocker, but that's in line with my point.
How much more money would you siphon into basically empty stores to see it as having gone "100%" of the way?
Which is even more damning of Microsoft / Google management.
The sunk cost fallacy is not really about how the venture actually turns out. One could be said to have fallen prey to it even if they double down and the venture eventually succeeds.
What's important is that at the time of the decision (a) the path doesn't seem to be working, and (b) they think "but I've spent too much to quit now".
This is more likely when one assumes the venture still has a chance to succeed than when they assume it will inevitably fail. Nobody who assumes inevitable failure would decide to continue.
All I know is at this point Microsoft had two options: continue investing into creating an alternative to Apple, or cancel their plans and sit back and let the Office/Azure revenue flow in.
Maybe it was a long shot, maybe they decided the size of Apple's customer base divided by two was not enough to satiate them, but whatever the case, they signaled that they do not have the talent/gumption/appetite for risk to pull it off. But if any company did have the opportunity to go for it, I would think Microsoft (and Google) with their income stream would have been in position to do it.
Both companies seem to dip their toes, but never follow through.
I would have liked to see it succeed if only for there to be more competition.
It also had a great 'copy and paste' feature
This hits hard
The prerequisite of being a "large tech company" to pull it off is already a huge barrier to entry.
And physical hardware kill switches? This almost seems too good to be true.
It will not (yet) replace your iPhone, although you can get pretty far on your own if you don't mind SSHing into such a device and messing with stuff on your own.
If it was focused on developers, it would by definition not be re-running the Apple playbook. Developers are much too small a market to justify that level of investment.
I'd be interested to know what "we" means in this sentence.
Do you mean your team?
Or developers in general?
If the latter, I am not sure what industry you work in, but I very much doubt the majority of developers operate in the environment you describe.
It would be amazing to build systems that are so beautiful that they inspire people like Berners-Lee and Carmack to do their things.
Most developers I know are happy with their Mac or Windows spyware, or their rough-edged Linux rig.