
Distributions are becoming irrelevant: difference was our strength and liability - slyall
https://blog.flameeyes.eu/2017/06/distributions-becoming-irrelevant/
======
msl09
> I tried getting the Debian Ruby team, who had similar complaints before, to
> join me on a mailing list so that we could discuss a common interface
> between Gems developers and Linux distributions, but once again, that did
> not actually happen.

2-3 years ago Debian put out a call for Python developers to help port all the
remaining Python 2 packages to Python 3, so that Python 2 support could
eventually be dropped. Being pretty familiar with Python and somewhat familiar
with migrating Python 2 projects, I decided to join the effort.

The process of porting mostly involved trying to contact existing maintainers
to see whether they were already porting their packages and, if not, offering
to help. I sent dozens of emails directly to maintainers and got maybe 2
replies. Eventually I got so frustrated with how slow and disorganized the
process was that I dropped out.

When I mentioned this to other Debian contributors, they said it was a very
old problem and that not much can be done about it, since Debian developers
are volunteers and likely have responsibilities other than maintaining their
packages.

In retrospect, I think Debian in particular needs an automated way to
periodically ask maintainers whether they are still willing to maintain their
packages and, if they decline or do not respond in a timely fashion, to put
those packages up for adoption.

~~~
vbernat
Most Python packages are team-maintained in Debian.

I don't know what you mean by "porting", but if it's just deleting Python 2
packages, that won't help: they are easy to maintain and they are still
useful. If you mean porting the actual code of each package to Python 3,
that's work to be done upstream. Packagers have enough to do with packaging.

~~~
gkya
I guess the subject is Debian's own programs written in Python 2, not third-
party code that they package.

------
eeZah7Ux
Yet again the HN crowd ignores that the large majority of software is not
deployed by developers.

Most organizations are not Facebook; they deploy from distributions, pull
security updates, and that's it.

Maintaining systems duct-taped together with
compilers/pip/npm/$container_tool_of_the_year becomes very expensive in the
long run, especially with tens of Frankenstein systems around.

Also, hunting and patching vulnerabilities becomes a big effort due to static
binaries, vendorized libs, and so on... (hello left-pad!)

~~~
mook
The large majority of software isn't deployed by developers, but it is
written by them, and it's rather difficult to deploy unwritten software. And
if the people who deploy it have to write it in the first place, they by
definition become the developers.

The issue can only be solved by catering to the needs of both parties. For
example, the currently available distro packaging formats tend to require
elevated privileges to deploy, which complicates the development cycle.
(Exceptions exist, of course; Nix, for instance, doesn't have this particular
issue.)

------
oelmekki
As a distribution user who is a developer for other things, I can totally
relate. Nowadays I use Docker for basically all my development, so that I
don't have to rely on whatever state my host system is in. This isn't just to
bypass distribution problems; it has advantages of its own, like easily using
different versions of an interpreter/compiler without even thinking about it,
and easily ejecting the whole environment when I don't need it anymore. My
server is structured the same way: a few sysadmin tools installed on the
host, then plenty of Docker images/containers. It's just more convenient.

As an end user browsing the web, watching movies and playing on Steam,
though, things are quite different. I was a Gentoo user for a decade, but I
finally migrated to Ubuntu (KDE Neon recently, but it's still Ubuntu-based)
because it _was_ becoming the standard. You can expect most vendors to have
at least Ubuntu in mind, and that's a big win as a user. And it's not as
painful as it once was as a developer (my main reasons for using Gentoo were
USE flags and fine-tuned configuration), because Docker covers all my needs
there.

~~~
vidarh
I'm increasingly using Docker (and to a lesser extent systemd-nspawn and rkt)
to containerise things even on my laptop. For very small tools it doesn't
make sense, but the time it takes to start a container is low enough that I
have an increasing array of things where the only thing installed locally is
a wrapper that does "docker run ...".
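
Such a wrapper can be tiny. A hypothetical sketch (the image name is invented
for illustration; swap in whatever image actually ships the tool):

```shell
#!/bin/sh
# Hypothetical wrapper for a containerised CLI tool ("jq" here, with an
# invented image name). The current directory is mounted into the
# container so the tool can see local files.
image="example/jq-image"
cmd="docker run --rm -i -v $PWD:/work -w /work $image jq"

# To actually use it (requires Docker installed), replace the echo with:
#   exec $cmd "$@"
echo "$cmd"
```

Dropping a few scripts like this into `~/bin` makes the containerised tools
indistinguishable from locally installed ones at the shell prompt.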

But in terms of distro, I agree - Ubuntu is painless enough that even though I
e.g. use i3wm and otherwise install a bunch of stuff that is non-standard,
Ubuntu is sufficiently convenient as a starting point.

~~~
majewsky
Does `docker pull` do any form of signing? Because if it does, I've never seen
it. I'm just waiting for some adversary to insert some malware in a popular
library image (something like nginx:latest or ruby:2.3) and have that deployed
on thousands, if not millions of servers. Assuming, of course, that this has
not already happened ( _cough_ National Security Letter _cough_ ).
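
(There is, as far as I can tell, an opt-in mechanism, Docker Content Trust,
but it's disabled by default, which is presumably why most deployments never
see it. Roughly:)

```shell
# Docker Content Trust is opt-in and off by default. With this variable
# set, `docker pull` refuses image tags without valid signatures from
# the image's Notary server.
export DOCKER_CONTENT_TRUST=1

# With trust enabled (requires Docker to actually run):
#   docker pull nginx:latest    # only succeeds if the tag is signed
echo "DOCKER_CONTENT_TRUST=$DOCKER_CONTENT_TRUST"
```

Being opt-in, it does nothing for the thousands of servers pulling images
with the default configuration.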

System package managers have protections in place against that: Package
signing means that you now have to hack the packager's laptop and obtain their
signing key to publish malicious packages, and reproducible builds will close
that attack vector as well. (Which is why I'm very disappointed that
distributions are not pushing reproducible builds as a top priority. If
nothing else, this might make me switch to NixOS soon.)

~~~
ptman
debian is pushing reproducible builds:
[https://wiki.debian.org/ReproducibleBuilds](https://wiki.debian.org/ReproducibleBuilds)

~~~
majewsky
I'm aware of that, but as far as I can see it's only a few people working on
it. "Pushing it" would be the core devs saying "all new packages after day X
_must_ build reproducibly", and then "on day Y, we will remove all packages
that do not build reproducibly".

~~~
ptman
Well, Debian is a volunteer effort; it's hard to force things on people who
will simply stop contributing if it stops feeling fun.

------
jacquesm
This goes much farther than Linux distros. All of open source suffers from
fragmentation and from re-implementing things that already exist rather than
contributing to the existing code base.

There are many underlying causes (ego, commerce, abrasive/lazy maintainers,
NIH and many others besides), and it is something that I believe only time
will solve. It's like music: a decade on, you'll be able to tell which ones
were the evergreens with staying power, while the rest wilt and are
forgotten.

For now I've bet on Debian/Ubuntu, Python and Erlang, I believe those three
will still be with us a decade or more from now.

~~~
digi_owl
> a decade on you'll be able to determine which ones were the evergreens and
> had staying power while the rest will wilt and be forgotten.

If only. A decade from now, many of the existing projects will have died not
from code quality but from project maintainer churn.

This is because the people who started the project to reimplement the old one
have run out of features to implement, and are thus faced with the prospect
of slowing down and fixing bugs. That isn't fun to them, so they call for
someone to take over.

More often than not, the person taking over is barely interested in the
current code, is mostly there for the glory, and invariably ends up deciding
unilaterally that the current code is unmaintainable and that a full rewrite
is in order.

~~~
jacquesm
True, code quality is not the right measure but I wasn't really making that
argument. Staying power also implies a lively community and one that can take
over maintenance and further improvements once the original team stops working
on the project. Any project that is still under 'original maintenance' is as
far as I'm concerned a large uncertainty.

One of the ones that I'm particularly worried about is the Linux kernel, if
and when Linus gives up working on it there will be a period of some
uncertainty.

------
joelthelion
I don't think they are. Ideally I think people would use:

- Flatpaks for one or two applications they deeply care about and use all day
(need to have full control)

- Pypi, npm, etc. for development libraries (again, you need more control and
need to be able to use different versions for different projects)

- Your distro's package manager for the remaining 95%.

You might not care as much about the remaining 95%, but you still need them
and want them to be reliable. If anything, this will push people towards more
stable distros (think Debian stable instead of Arch).

Edit: Pypi, not pypy

~~~
omegote
> PyPy

I think you mean pip

------
TekMol
For me, my distribution is highly relevant. It's my single point of trust for
the native software I use.

I never install anything that is not in the repos. Everything else I use is a
web app.

~~~
jrimbault
That seems sad to me, living within the confines of someone else's choices.

There is a load of good software that just isn't in any repo of any Linux
distro.

IMO, one of the advantages of Linux is how easy it is to find and install
software from source, compared to Windows, where that's an almost unheard-of
practice.

~~~
wolfgang42
_> It seems sad to me, living like that in the confines of someone else's
choices._

Counterpoint: Sometimes the confines of someone else's choices can be freeing.

Several years ago, I decided to switch to Arch. In the beginning, this was
incredibly enlightening: I could set up my system however I wanted! I could
make my own packages if the one I wanted wasn't in the repository!

Soon a sort of fatigue began to set in: whenever I wanted to install a piece
of software, I had to first decide which software I wanted. Every new little
task turned into a large project: I wanted to add up a column of numbers, and
now I had to decide between gnumeric, libreoffice-calc, and so on.

I eventually wound up switching back to Kubuntu purely because it had more or
less everything I needed pre-installed, so I could focus on actually doing
the work I wanted to get done rather than figuring out the optimal way to do
it. Any given task may be slightly less efficient than optimal, but in total
the effort spent is a lot less for the same amount of work done.

~~~
jrimbault
That's a good argument. And I feel that more and more as the years pass.

I'm still loving Arch and the whole DIY attitude. But I feel like the next
time I have to change computers, I could go with something that "Just
Works™"; currently Fedora seems to be in a good place.

------
nippples
I've noticed that I no longer care so much about which distributions are
available, but more about a distribution's preferred packaging system and how
up-to-date its repositories are.

With many programming languages having their own preferred tools for
circumventing a distro's packaging system, that too has somewhat diminished
the importance of distro packaging systems.

Still, I think it's excessive to say that "distributions" are becoming
irrelevant. There's plenty of opportunity for distros to compete on providing
a good out-of-the-box setup, fine-tuning tools, or a dumbified interface.

------
tpaschalis
What about the end user? The >90% market share of Windows is made up of
people who won't be tinkering with their distro, or caring about things like
package managers or custom desktop environments.

But is it possible to grab a larger piece of the pie? That's millions of users
that could benefit from a free, secure, lightweight Operating System.

~~~
gkya
Unfortunately, that requires a system that they would want to use, which is
detrimental to what makes those OSes favourable to us.

------
belorn
> And setting aside their trust in distributions, sysadmin only care to have a
> sane handling of dependencies and ... upgrade them in case of a security
> issues

As a sysadmin, default configuration is a key role that distributions play.
Do they ship sane (opinionated) defaults, or do they take a hands-off policy
with no defaults, pushing the user to actively configure each program?

Debian, for example, is not just a package manager. It is also the Debian
policy. If one is using hundreds of Docker containers, there could also be an
identical number of policies, up to the point where a common policy dictates
what a container should include and do. At that point you are back to a
single distribution whose package manager just happens to be based on Docker.

------
dredmorbius
NB: Debian announced it would drop LSB in 2015, and as of stretch, which was
just released, I understand most library support based on it has been
removed.

[https://lwn.net/Articles/658809/](https://lwn.net/Articles/658809/)

[https://www.debian.org/releases/stretch/amd64/release-notes/ch-information.en.html#reduced-lsb-support](https://www.debian.org/releases/stretch/amd64/release-notes/ch-information.en.html#reduced-lsb-support)

------
gtirloni
This article is written from the perspective of someone working on creating
Linux distribution(s). I agree many distributions lost their appeal because
now it seems most people are happy with having two options: Debian/Ubuntu and
Fedora/RHEL/CentOS. There's less appeal in creating yet another distribution
and I think that's a good thing for developers and users alike.

My initial impression from the title was that the author would be making the
point that _no_ Linux distribution is necessary. Although I can understand
that, on servers, developers care less and less about where their apps are
running because of containers and other tools, people still have to select a
base Docker image... and boom, distribution choice again (unless you're
shipping a single static binary, of course).

Except for things like Nix and other non-traditional approaches to
configuration outside the main DEB/RPM camps, all other distributions are
trying to solve the same problems in slightly different ways... I'd argue the
disadvantages of that outweigh the benefits.

------
pjmlp
After being a strong GNU/Linux zealot in the early days, it started to become
irrelevant to me the moment I stopped caring about POSIX.

As long as my tools of choice offer rich libraries and packaging systems, the
actual underlying OS is no longer relevant unless I require some special use
case.

Looking at how Linux distributions turned out, I think GNU/Linux developers
failed to learn from what happened in the UNIX wars.

~~~
gtirloni
Could you elaborate on what kind of application you're talking about and what
are its requirements?

If your app is insulated enough that it doesn't care about POSIX features,
aren't you simply outsourcing those requirements to the underlying components
(which will care about POSIX)?

~~~
d3ckard
Isn't delegating (outsourcing) the whole point? Once you do not need POSIX
(and POSIX itself does not require Linux specifically) and your tools take
care of the underlying system, why would you care about the system you're
running (or, to be more precise, about running Linux in particular)?

------
digi_owl
Frankly no. The problem is not with distros, but with upstream.

Having tried to self-maintain a distro that allows me to do so, GoboLinux, I
have witnessed first-hand the kinds of mess upstream makes of dependencies.

All too often even a slight bump in minor versions will cause subtle bugs and
breakages in behavior. Update some underlying lib just a few notches and
suddenly your file manager's trash can is gone. Downgrade the underlying lib
and it returns.

But then some other program you want to upgrade insists on using the latest
and "greatest" libs, or has deprecated support for one protocol in favor of
another, just because the maintainer likes it better.

Thus even with GoboLinux offering me the ability to stack lib versions and
whatnot, all too often I simply have to freeze everything in place once I
have it reasonably where I want it.

Change anything, and I have to change everything, and then a full reinstall
may well be in order.

~~~
stephengillie
Windows addresses this by maintaining a "side-by-side" set of library
versions, so applications can have several versions of a library available to
them. This is what the \Windows\WinSxS\ folder stores.

~~~
digi_owl
At first glance, WinSxS strikes me as a centralized variant of what Windows
has done since the early days: search the local sub-folders near a binary
before looking elsewhere for a relevant DLL.

Basically this is not much different from what is being suggested with
Flatpak, Snap, Docker, etc etc etc.

Never mind that Unix already offers something similar with SONAMEs. This is a
naming scheme for lib files that puts an interface version of sorts into the
name; symlinks then point general names to more specific versioned names.

On paper this allows the binary to get the right lib version for the job. But
all too often either the lib maintainer does not update the SONAME properly,
or the maintainer of whatever program uses the library is not specific enough
about the SONAME.
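
The on-disk convention looks roughly like this (library name and version
numbers invented for illustration):

```shell
# Sketch of the SONAME symlink scheme for an invented libfoo.
# Binaries record the SONAME (libfoo.so.1) at link time; the dynamic
# loader then follows the symlink chain down to the real file.
dir=$(mktemp -d)
cd "$dir"
touch libfoo.so.1.2.3                # the real library file
ln -s libfoo.so.1.2.3 libfoo.so.1    # SONAME link: what binaries request
ln -s libfoo.so.1 libfoo.so          # dev link: what `-lfoo` resolves to
ls -l libfoo.so*
```

The whole scheme only works if bumping 1.2.3 to 1.3.0 really is
ABI-compatible, and bumping to 2.0.0 really does change the SONAME; both are
human judgment calls.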

Frankly, the problem is not so much technical as it is managerial.

Of the two major UI libs on Linux, Qt keeps surprising me with its backwards-
compatibility. I can apparently get Qt5 compiled against Xorg versions that
send GTK3 screaming about missing dependencies.

This, I suspect, is because Qt is a for-profit, dual-licensed project. Thus
there is an incentive to be as broad as possible with support.

------
contingencies
Obviously, build automation will continue to be a critical part of all
distributions, pre-configured defaults will continue to be a subjective
matter, and various takes will always exist on the optimum degree of knob-
twiddlability for any given user community. This is just technical reality and
human nature.

I don't see any real point here.

PS. IIRC this guy used to manage Gentoo's build infrastructure, dropped off
the radar owing to some kind of professional burnout / direction change a few
years ago. While I have immense respect for anyone who can deal with
bureaucracy long enough to actually become a Gentoo developer, personally I
don't see this guy's personal rants as representative of much at all.

------
ekianjo
Nobody is going to use containers for every library of their distro. So
distros have, of necessity, a reason to exist, along with their package
managers. Having containers to run larger applications is an additional way
to run software, nothing more, and it certainly does not make distros any
more obsolete than they already are.

~~~
digi_owl
Watch Freedesktop/GNOME give them no choice in the matter, as they
aggressively deprecate anything not using Flatpak...

------
xj9
> If we all reacted the same way, we'd be predictable, and there's always more
> than one way to view a situation. What's true for the group is also true for
> the individual. It's simple: Overspecialize, and you breed in weakness. It's
> slow death.

UNITY IS DEATH

