It really doesn't. And the desktop Linux community's obsession with bundling a few tweaks and preinstalled applications as separate "distros", splintering the community into a million tiny sub-groups, is part of the reason it's failed to achieve mainstream popularity.
I'd agree that it doesn't matter much within a certain band of distros. Fedora vs. Ubuntu, for a new user without opinions about things like Flatpak or rpm vs. dpkg, doesn't matter much, sure, I agree. Throw in Slackware, Gentoo, Void, and Arch, and now it kinda does matter which you choose. Even Debian: your software will be farther behind current releases, and its preferences about things like non-free software are likely to be something you notice and have to deal with, one way or another.
But, among the small set of relatively user-friendly distros, sure, it doesn't actually matter that much. A generous reading would take that meaning from it, I think.
I've noticed that people tend to think of the surface level of any desktop OS.
They think about the GUI, the command-line programs that ship with it (curl/grep/ls/etc.), the driver support, and the package manager it ships with. These are all trivial abstractions built on deeper facilities.
The farther down you go, the less people understand. Who can actually articulate the difference between X11 and Wayland without weasel words and equivocation? What about Mesa, or D-Bus, or PulseAudio? These are all core components that alter the "flavor" of a desktop Linux system. And yet all distros use basically the same off-the-shelf components; they only change the higher-level GUIs, package managers, and so on.
And people GROSSLY underestimate how much the kernel contributes to the "flavor" of Linux. (I could go on and on about how the GNU GPL directly impacts how drivers are developed for the kernel, or how the small number of core devs are overwhelmed by additions for hardware drivers which move rapidly and break things, and the subsequent vulnerability patches, leaving little time for desktop-focused improvements).
People tend to say things like "the kernel just manages the hardware" or "the kernel is just an interface layer for the hardware" or "MacOS and BSD are the same, only the kernel differs". If only they knew. The kernel is like a seed crystal that defines what can grow outward from there (without massive painful compatibility shims).
Lastly, people OVERestimate the importance of things that are entirely irrelevant to a desktop OS. Just look at how many desktop Linux users are arguing over systemd vs. SysV vs. whatever else. Are desktop Linux users really digging into log files and annoyed that the logs are now in a binary format? Are desktop Linux users really annoyed that systemd now ships its own sudo replacement (run0) rather than leaving sudo as a standalone binary they can swap out? I think the number is low, but the number of desktop Linux users arguing about such things is high.
I think you have pretty much covered all the main reasons.
I use Arch (formerly Gentoo) on my work and home PCs/laptops because I like rolling releases on the bleeding edge. I generally run Ubuntu LTS minimal installs for servers because they are tiny and stable and guaranteed to be upgradeable to the next LTS release. I run Home Assistant IoT wranglers on Debian because that's what HA insists on for "Supervised".
My wife uses Arch because I look after it and she doesn't care. It simply has to just work and it has for years now without skipping a beat.
Upgrading hardware for laptops and PCs means dumping the filesystems to files on a server or whatever and blatting them onto the new device. If there is physical space, put the old HD/SSD/whatevs into the new box and use a live CD like SystemRescue or Clonezilla. All the drivers are built in out of the box. These days most things simply work with minimal fiddling. I can't remember the last time I fiddled with xorg.conf. OK, I disabled the touchscreen on this laptop when I cracked it, and that involved fiddling with xorg. I remember setting modelines by hand in XFree86 ...
It gets on my nerves sometimes, but then I remember the days of creating init scripts à la Miquel van Smoorenburg (with various dialects), Upstart, Gentoo-style OpenRC jobbies, and the rest.
I get on with my day and you can barely see the blood dribble out of the corner of my mouth when I put the verb in the wrong position. systemctl rofl restart ... [fuck] backspace etc ... [bollocks][arse] ... hit enter.
IMO it's about time we converge on just 4-5 languages -- each with unique benefits that can't be had easily in the other ones! -- and finally start to be truly productive.
Programming is mostly a complete mess today and everybody loves their own ugly disabled baby. Sigh.
Upvoted, and I do like the idea; it's an interesting question how many languages we really need. I remember some paper from the 80s suggesting that in the future all we would need is Ada and Lisp. This is somewhat echoed in the GNU project's attempt at C for low-level dev and Guile for higher-level dev and scripting. With no judgement of suitability, and not trying to exclude anything, just tossing out examples, I'd guess it looks something like:
Small footprint systems language - C, Zig, Ada/SPARK;
Higher level dynamic language - CL, Scheme, JS, Clojure;
Full Spectrum Language - Rust, Red + Red/System;
Basic low level embedded - Forth;
Higher level static language - Haskell, SML, Java;
But already these bins start breaking down; static languages in particular are a minefield. Lazy vs. eager evaluation and nominal vs. structural typing are two major axes of differentiation there. Then, for both static and dynamic languages, there's the question of immutable by default vs. immutable as an add-on. Further, do you need three languages for system, high-level, and full-spectrum work, or should you pick one of the two approaches? And even this binning of 5 completely ignores to what degree concurrency should be a first-class concern of the language. Then there's the whole question of VMs: is there value in building your whole ecosystem on a shared VM? And further, there's the problem of exposing functionality to non-developers. Should we include a tool like Lua or R that targets non-devs, use a tool like Racket or Red with their very explicit support for creating small custom DSLs, or is that a total non-issue because the correct solution is to write GUI tools for that market?
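The "immutable by default" axis is concrete enough to sketch in code. A minimal Rust example (Rust chosen here only as a representative of the immutable-by-default camp; the function is made up for illustration):

```rust
// In Rust, bindings are immutable unless you explicitly opt in with `mut`.
fn count_up(total: u32) -> u32 {
    let limit = total; // immutable binding: `limit += 1` would not compile
    let mut count = 0; // mutation must be requested explicitly
    for _ in 0..limit {
        count += 1;
    }
    count
}

fn main() {
    println!("{}", count_up(10)); // prints 10
}
```

In the "immutable as an add-on" camp the relationship is inverted: everything is mutable by default, and you reach for a frozen or persistent data structure when you want immutability.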
Then the bigger question is: to kitchen-sink or not. Languages like CL, Scala, and C++ have taken the approach of implementing a ton of features and then trusting the developers to sort it out. Other languages are laser-focused on a single feature and take it to its extreme, kinda like Clojure does with immutability or Pony with actors. Yet if we don't embrace multiparadigm languages, we're leaving a ton of research on the table or accepting the zoo of programming languages.
> I remember some paper from the 80s suggesting in the future all we would need is Ada and Lisp. This is somewhat echoed with the GNU projects attempt at C for low level dev and Guile for higher level dev and scripting.
This argument is even more true today (minus the concrete language names). We desperately need such a curated subset.
The problem is of course us the pesky humans with short-sighted feelings we cling to as if our life depends on it. Desired job security is a big offender as well.
The pertinent questions today with regards to a language are:
- Is the runtime fast (if it's interpreted)?
- If it's compiled ahead-of-time, does it produce efficient machine code? (Go is one example where the generated machine code could be better, whereas OCaml and Rust are known to produce some seriously fast machine code.)
- Are the runtime's performance characteristics predictable, e.g. latency remains stable under load? (Especially if the runtime has a garbage collector.)
- Does it run on a reasonable number of platforms? ARM, x86 (32/64-bit), AVR, and a few more? Can it run on embedded devices?
- Does the language/runtime/ecosystem give you good parallel/concurrent abilities? Preemptively scheduled actors and/or green threads are probably the best idea for servers (Erlang and Elixir are good examples due to the underlying BEAM VM). IMO this is hugely important nowadays. Stuff like parallel iterators and parallel map/reduce/join/various-transform operations are other important enablers.
- Does the language help you avoid various bugs? Examples are Rust's borrow checker or many languages' support for sum types.
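On the parallel map/reduce point: a hedged sketch using only the Rust standard library's `std::thread::scope` (real code would more likely reach for a library like rayon; the function name and chunk count are made up for illustration):

```rust
use std::thread;

// Sum a slice by splitting it into chunks and reducing each chunk on its own thread.
// Assumes `chunks >= 1`.
fn parallel_sum(data: &[i64], chunks: usize) -> i64 {
    let chunk_size = (data.len() / chunks).max(1);
    thread::scope(|s| {
        // Map phase: one thread per chunk.
        let handles: Vec<_> = data
            .chunks(chunk_size)
            .map(|chunk| s.spawn(move || chunk.iter().sum::<i64>()))
            .collect();
        // Reduce phase: join the threads and combine partial sums.
        handles.into_iter().map(|h| h.join().unwrap()).sum()
    })
}

fn main() {
    let data: Vec<i64> = (1..=100).collect();
    println!("{}", parallel_sum(&data, 4)); // prints 5050
}
```

Scoped threads let the workers borrow the slice directly, so no copying or reference counting is needed for the fan-out.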
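For a concrete instance of the sum-types point, here is a minimal Rust sketch (the type and function are invented for illustration) where the compiler forces every case to be handled:

```rust
// A toy sum type: a parsed token is exactly one of these variants.
enum Parsed {
    Number(i64),
    Word(String),
    Empty,
}

fn describe(p: &Parsed) -> String {
    // `match` must cover every variant; forgetting one is a compile-time
    // error, which is exactly the class of bug this bullet is about.
    match p {
        Parsed::Number(n) => format!("number: {}", n),
        Parsed::Word(w) => format!("word: {}", w),
        Parsed::Empty => "empty".to_string(),
    }
}

fn main() {
    println!("{}", describe(&Parsed::Number(42))); // prints "number: 42"
}
```

Adding a fourth variant later makes every non-exhaustive `match` fail to compile, so the "forgot to handle this case" bug is caught before the program ever runs.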
---
Additionally... why do we even use languages that don't get compiled by GCC or LLVM, at all? I am tired of listening to people's stories about their beautifully simple and genius compiler... which of course can't even do SIMD intrinsics or properly unroll loops. Yeah, "genius" and super simple indeed. /facepalms
Will we ever learn? Gods.
---
I've been around and I can claim that many times people get attached to the language simply because it has good libraries and good community. I fully relate to that and it's the reason why I want to work with Elixir even after 5 years of mostly uninterrupted career with it but... at one point, after many other problems are solved, you inevitably start hitting brick walls and you question your career choices.
I can't claim any strong experience or authority even after almost 20 years in the industry but so far only Rust seems to be a very good all-around language with a dead-serious and dedicated community that's trying to penetrate basically everywhere. The good news is that they might just succeed because the language is that good -- although it too started showing some warts but so far they're bearable.
But I don't see the world putting their differences aside and starting to work on a common cause. It's sadly not how humanity works. :(
The splintering is real, but the differences between distros are also real. The pace of updates and the availability of packages are pretty important, as is a large community you can lean on for support, even if only passively by searching for solutions to common problems. Maybe you mean "of the 5 most popular distros it doesn't matter"; then sure.
Been using Ubuntu for 15+ years and never used gnome. I always install xfce4 (which has basic tiling), Plank and a few other little gadgets. It stays so much out of my way that I basically don't even notice I'm using it. I tried one or two other distros but generally come back to Ubuntu because it hits the sweet spot of modernity and stability.
Just don't. Why bundle stuff if the user can just install it using the package manager?
It's easier to add to the system than remove redundant packages, and if the distro developers focus on the repositories and package management software it benefits everyone much more. That's one of the big reasons people go for Arch.
Nobody cares if LibreOffice or Inkscape or whatever is preinstalled. Just put a usable appstore on the taskbar by default, it works for smartphones too. Many distros are about as necessary as Samsung's or LG's customized Android builds.
Some distros are far less buggy than others. E.g., I don't have any issues on Arch, but I did with Manjaro (there were often bugs that prevented updates from completing successfully).
I've been using Linux for years, so I've got my opinions about distros. Why should the distribution a new user pick matter that much? They can easily switch if they don't like the first one they pick.
Well, IMO, a new user should pick a distribution that has different DEs available in its repository. That makes switching and trying different DEs very easy, as opposed to some distros which are specifically tailored to a single supported DE.
What happens then is that the new Linux user soon needs a newer version of some package, so they have to add a third-party repo to their stable system; after they've done this a few times, an update from one of them will eventually leave the install unbootable.
I've had a way better experience with desktop Linux when using a managed rolling release like Manjaro.
That's the point I stopped reading.