Leaderless Debian (lwn.net)
119 points by Tomte 10 days ago | 117 comments





Controversial opinion warning, and I would love to hear some counterpoints outside of my thought bubble.

I've never understood the appeal of Debian or Ubuntu, or I guess I don't understand why people would invest so much time in them compared to Fedora and Red Hat. In actual business dealings Red Hat has always led the way in enterprise (whole data centers), kernel development (a bunch of the maintainers work for them and they lead efforts like cgroups v2), and other highly technical pieces of the ecosystem, including acquiring all of the star developers (such as the systemd folks). Investing in Red Hat is an investment in a deeper understanding of Linux through their excellent documentation, and in well-paying positions through their famously good certifications.

There is of course a lot of Debian and Ubuntu out there in small deployments, usually individuals, small businesses, and education, but they are ultimately less lucrative and more oriented towards beginners/hobbyists. At least Ubuntu used to be. I'm not really sure who the target audience of Debian is.

I've also never understood the appeal of .deb or apt, or what Debian's end goal is. They never seem to be doing anything that interesting except grinding away at repackaging everything. Canonical is just flailing to find some sort of niche, and historically (Ubuntu phone, Mir, OpenStack, Upstart, probably snap) they are incredibly bad at finding relevance. I think having multiple packaging ecosystems is the stupidest idea in the world, and I don't see the value add except giving people the illusion of more choices.


For a long while Debian and Ubuntu were the obvious choice for a free-as-in-beer (and not just free-as-in-speech) server OS. Fedora doesn't have an LTS program, and CentOS, Scientific Linux, et al. were unofficial. (CentOS is now more official, so that benefit has gone away a bit.) Debian has an enterprise-compatible release cycle - a new release every 2 years, give or take a few months, and each one gets security updates for a year after the next release is available - and you can use the same OS the Debian developers use. Ubuntu has an even-more-friendly LTS release cycle, also every 2 years, with security updates for 5 years instead of 3.

About 10 years ago I was responsible for some servers running Fedora and some servers running Debian. Fedora's update cycle was way too fast, but we also knew that switching to CentOS would get us software that was too far behind to be useful, and we'd be responsible for building a lot of stuff in /usr/local ourselves.

A small technical difference is that the dpkg ecosystem has a feature called "diversions" that lets you cleanly say that you want to replace a packaged file. A config management system can use this to avoid having config management and the package manager fight each other. I'm the current maintainer of https://github.com/sipb/config-package-dev, a tool that lets you build Debian packages that express system configuration, and I've ended up using it at 3 of my last 4 jobs. For the Fedora servers, we were using some mix of checking out /etc from Subversion and rebuilding packages on our own. We surveyed our options and determined that we weren't really better off with any of the "real" config management options at the time (to be fair, modern config management hadn't really taken off then), and the lack of diversions meant we were better off rebuilding the packages we wanted to change.
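
For illustration, the raw mechanism looks something like this (a simplified, hypothetical sshd example; config-package-dev essentially wraps this kind of thing up in package form):

  # move the packaged file aside; dpkg remembers and honors the diversion
  dpkg-divert --add --rename \
      --divert /etc/ssh/sshd_config.orig /etc/ssh/sshd_config
  # now ship our own copy; package upgrades won't clobber it
  cp our-sshd_config /etc/ssh/sshd_config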


Out of interest, how are diversions different from the alternatives command?

In alternatives, packages write their files to the disk and then register those as alternatives. A user can decide which one should satisfy the alternative.

In diversions, a package claims a file. When anyone else tries to overwrite the same file, it's diverted elsewhere. It isn't designed for the user to choose between equals.
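
Concretely, the alternatives side looks like this (the "editor" alternative is a real one on Debian; the priorities here are arbitrary):

  # each candidate package registers itself with a priority...
  update-alternatives --install /usr/bin/editor editor /usr/bin/vim.basic 30
  update-alternatives --install /usr/bin/editor editor /usr/bin/nano 40
  # ...and the user can then pick between equals
  update-alternatives --config editor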

I built a derivative distro using diversions mainly to prevent "soft-forking" Debian packages as much as possible (maintaining packages is extremely costly for anyone that isn't a decentralized organization like Debian).


Because 10 years ago Ubuntu supported way more configurations out of the box and was more user-friendly for laptops and desktops. And now nobody wants to spend time trying something new: don't fix what's not broken.

I have no incentive to ever install anything other than Ubuntu on my laptop, given the investment I've made in getting used to it. And on servers, I use Ubuntu just because I know it.


I think a little distro hopping is good for the soul. I have certainly had my time with Arch and Fedora, but I have settled on openSUSE because, to me, it is the perfect distro. You can get openSUSE Leap, a typical distro with release cycles, and then there is openSUSE Tumbleweed, which is a rolling release.

Zypper is by far the best CLI package tool, and the way they make it super easy to build and install packages with their own build system and repos is just best in class. It's all done online and can be one-click, which just makes me wonder why the community still avoids them.

https://software.opensuse.org/find


Ahh, SUSE... the one I wanted to use in 2001-2005 but didn't, because there was a big greedy commercial company behind it.

The idiocies of youth. Now I kinda wish I'd made that commitment. Must check it out again.


SUSE was my first distro in college back in the late 90s. My first job was my exposure to Debian, and I've been on it since. Distro hopping is fun. The issue with openSUSE is that it's huge unless you do a minimalist install. YaST is a great tool, however; Debian should have something like it. I agree zypper is a nice package manager, and it's fast.

Can you sell me on SUSE a little bit more?

^this, plus: 1999 was literally my first Debian install. Three floppy disks, what could be the problem? It installed on the first machine I threw at it (thank you, 3Com and Intel). And it solved enough problems that I knew I could throw it at the next system I installed. & etc.

The thing is, for a long time, deb/apt were way superior to what rpm/yum offered. Heck, Debian had apt way earlier than Red Hat had yum.

If anything, Red Hat should have adopted apt.

In terms of sheer number of dumb competing initiatives, Ubuntu > Red Hat > Debian.


Deb/apt might have been nicer to use, but as a packager I found the packaging format extremely horrible to work with.

There's a very complex interaction between the various files which define a package. It's extremely hard to find out what debhelper and its associated commands are actually doing and to debug them, particularly if you're trying to split a package into multiple parts. I gave up several times trying to make my package before finally succeeding with a simplified version. I'm not sure I will understand how it works when I go back to it.

On top of that, the sheer volume of rules and regulations, different packaging helpers, and out-of-date suggestions just added to the problem.

Deb has far too much magic. Rpm is much clearer. I can define a simple configuration file which takes the source, patches it, builds it, and describes the packaged files. I've never had a problem understanding an rpm source package.
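
To illustrate the magic: a typical modern debian/rules is just this, with everything it actually does hidden inside debhelper's dh sequencer (the recipe line must be a literal tab-indented "dh $@", since it's a makefile):

  #!/usr/bin/make -f
  %:
  	dh $@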


That's because APT serves the user rather than the developer.

As a sysadmin, of one box or 100k boxen, I'd have a hell of a lot less apprehension about installing a DEB than an RPM, thanks largely to Debian Policy, which governs how packages will behave.

The fact that Debian has an order of magnitude more in-repo, in-distro packages also counts for something.

As well as the fact that I can start with a minimal installation and add to it only what the box actually needs, a process Red Hat and RPM have never adequately supported.

Source: Have run both, personally and professionally, for nearing a quarter century.


I upvoted you. Yes, APT was the major reason I chose Debian 15 years ago. It was much better than RPM without YUM (which wasn't part of RH systems back then, IIRC).

Nowadays the competitive edge of APT may have diminished, but I never looked elsewhere, because I have been quite happy with APT.


Yum has been around for so long that I really can’t believe people still bring this up. Even a decade after yum came around, I still heard Debian people comparing “rpm” vs “apt”, which is a completely apples to oranges comparison. I can only assume that those Debian people were so out of touch with other distros that their opinions could not be trusted.

Yum has been largely replaced by dnf, which is far and away better. Despite being a Debian/Slackware fan since 1998, I do find dnf quite nice. It's still a fair bit slower than apt or pacman, but it's clean and easy to use. Slackware does have package management tools, but I encourage people to learn how to do things the old-school way so they can learn Linux. If you know Slackware, you understand how Linux truly works. Package managers are great for production systems, especially stuff like apt, and, for the odd system with software hangups, apt pinning.

Yes, but my point is that Debian people always seem to cite “rpm hell” with regard to manually tracking down rpm dependencies, while ignoring the fact that that hasn’t been a problem for decades.

I’ve never understood the pure love for apt. Ports/pkg is better. Gentoo’s package manager is better. Even yast was better (imho).

Whenever I needed a package from an external repository, apt almost always fell down into a pit of duplicate and incompatible dependencies. Maybe it's better now, but I was burned long ago and never looked back.


I haven't had a problem with apt since I was using Debian Unstable (sid) back in 2005... and even then, I think I installed stuff from other repos that conflicted with the base system.

Portage is cute but it's tied to Gentoo and few people want to compile their own packages.

Apt was one of the earliest package managers for Linux, for a popular distribution. That was enough to give it a head start regarding packages available. And that's all you need, unless the other solutions are obviously superior from the user perspective, which they're not.


> Ports/pkg is better. Gentoo’s package manager is better.

Strongly agree. Gentoo's Portage is the best experience I've had with a package manager on Linux.

My package management observations, as someone who's been using them for... a while, now.

Best experiences: Portage on Gentoo (lots of options; many packages; fast; reliable unless I broke it doing something stupid; maintaining a high level of control over the system at a low level is the only way I've ever managed to make Linux behave itself), Homebrew on MacOS (great package selection; casks are wonderful; outstanding CLI; keeping user packages away from the core system has really grown on me)

Middling: apt on Ubuntu and Debian (OK but increasingly feels dirty after getting used to Homebrew and keeping my user packages separate from the system; breaks "on its own" more often than Portage did, though still rarely; package selection just OK)

Bad: Yum/RPM on Red Hat, Fedora, and long ago, RPM on Mandrake (sloooooooowwwwwwwww and RPMs seemed to break constantly back when I used them), Macports on MacOS (slow; broke so bad the easiest thing to do was to delete the whole dir and start over about once a quarter; limited package selection)

Incidentally, Gentoo's OpenRC is the only Linux init system I've ever used that I liked and felt like I actually understood well enough to use it without apprehension or constantly needing to check documentation.

I like the idea of Nix and tried NixOS once but after an hour or so of following a guide and troubleshooting couldn't get X running so gave it up as "for those with more free time than me—try again in a few years".


Portage and OpenRC are the two reasons that I keep Gentoo as my daily.

I try about 1-2 other distros every year (in VMs or a spare laptop), and I always stay with Gentoo. I won't go so far as to say that it's objectively better, but after 10 years or so just nothing else compares for me.


Linux had/has more mainstream support than FreeBSD, unfortunately, so ports/pkg never really got the attention and love it deserved.

So I've heard this before, and I really want to know: what are the things that make apt better? The only thing I can see is that yum was pretty slow (dnf seems to be marginally better). Otherwise, the rpm package format seems pretty on par with deb, and apt seems like a super crusty interface with multiple commands and stuff.

Debian had a much more rigorous packaging policy than Red Hat/Fedora Core early on (the same policy that slows down development and annoys developers today, as you can see in various recent HN threads), and way more packages in the main repos.

Using RH easily led you into an "rpm hell" of packages from various third-party repos that conflicted: https://en.wikipedia.org/wiki/Dependency_hell#Platform-speci...

Now they're both comparable. Yum/dnf are OK, and the rpm repos are big. And apt actually has "apt", the tool, which consolidates those separate tools you mention. Before that it had aptitude (since at least 2006 or so, I'd say), but it wasn't used much; now apt is installed by default and you can use it. No more "apt-get", "apt-cache", etc.
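
For anyone still typing the old commands, the mapping is roughly:

  apt install foo          # apt-get install foo
  apt search foo           # apt-cache search foo
  apt show foo             # apt-cache show foo
  apt list --upgradable    # list pending upgrades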

So today, nothing makes one of them substantially better, IMO.


apt is a dependency resolver, and both apt-get and aptitude (an alternative implementation) are good at dependency resolution. Pre-yum (so for RHEL before 2007), there wasn't a good dependency resolver on RHEL. yum and dnf have each improved significantly on this, but it's a lot nicer to get the packages you need and know that they're compatible than to find RPMs and install them and see if things work.

As a consequence, Debian packaging policy is (in my experience) a lot more aggressive about using detailed dependency-resolution metadata to establish constraints and define compatibility, which makes for a more annoying developer experience but a nicer end-user experience.
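
As a made-up example of that metadata, a debian/control stanza can declare versioned constraints that the resolver then enforces (the package names here are hypothetical):

  Package: myapp
  Depends: libc6 (>= 2.17), libfoo1 (>= 1.2), python3
  Recommends: ca-certificates
  Breaks: myapp-data (<< 2.0)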


I will add one thing: apt pinning[1]. It allows you to tweak the way apt chooses the version to install, letting you, through a plain-text configuration file, give higher (or lower) priority to specific releases, repository components, repositories, package names, etc.

I started using it before 2005, before backports was a thing, and it was a lifesaver for my desktops, as I usually wanted to use newer versions from testing or unstable.
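
A minimal /etc/apt/preferences along those lines, tracking stable while allowing cherry-picks from testing (priorities in the spirit of the apt_preferences manpage examples):

  Package: *
  Pin: release a=stable
  Pin-Priority: 900

  Package: *
  Pin: release a=testing
  Pin-Priority: 400

With that in place, "apt-get install -t testing somepackage" pulls just that one package (plus what it needs) from testing, while everything else stays on stable.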

[1] https://www.debian.org/doc/manuals/debian-reference/ch02.en....


> I really want to know what are the things that make apt better?

For apt as a format, probably not so much anymore. For the Debian apt repositories, it's the several thousand capable and driven maintainers.


Why Debian? It values freedom:

  A) of software - DFSG
  B) of decisionmaking - Debian constitution
  C) of choice
Stemming from that, there are the following advantages I value (opinion ahead):

  * both the installed program and its source are just an apt command away (A, C; see the example below)
  * has a large repository of software (historically largest; nowadays it's still among the largest) (A, C)
  * security support for stable release for the whole archive (cf. Ubuntu, RHEL) (A)
  * software from repository aims to play nice together (C)
  * true community distro, independent from any single vendor (B)
  * multiple kernels, multiple desktop environments, multiple init systems, a truckload of architectures (C)
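
On that first point, assuming deb-src lines in sources.list ("hello" is just an example package):

  apt install hello    # the binary package
  apt source hello     # the matching source, Debian patches included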

True community distro is what I was going to say as well. I view Debian as 'plain old linux.'

My fear, though, is that Debian is not going to be able to keep up with the pace of commercial Linux distributions. Things are moving at light speed these days.


Not keep up in what way? Package version freshness? Relatively recent kernels?

Debian is a community project based on the practical willingness of individuals to work on fixing problems, usually ones that affect themselves. Red Hat is a company solving problems for customers. Two different approaches that sometimes align into similar solutions and sometimes don't.

A good example is initramfs-tools, which Debian created. Before that, (most? all?) distros just had a bunch of shell scripts mostly concatenated together, but hard-coding that for every possible configuration that Debian allows was painful for the maintainers, so they created initramfs-tools. They did not have a bunch of consultants who could sell handcrafted solutions to customers.

Red Hat decided many years later to create a similar solution, but for "reasons" did not want to port initramfs-tools and created dracut. Debian is now considering porting to dracut, since maintaining initramfs-tools is a lot of work which could be spent elsewhere. The DebConf BoF on that was basically the maintainer asking "OK, anyone interested in continuing to maintain initramfs-tools? How did the test port of dracut go?"

The incentives and politics in a community project are very different from those within a company. Not always better for everybody, but very distinct.


I guess I don't understand, because Fedora is also a community-driven project. Anyone can contribute to packaging, steering, designing, etc., not just Red Hat employees.

Fedora was not the one making the decisions in the rather specialized area of the initial boot. Why that is, one can only speculate, but I would hazard a guess that when there is an incentive conflict between what is best for Red Hat and what is best for Fedora, the focus of both will go where they align. Path of least resistance and all that.

I would also guess that the small number of developers who work in such a niche area were basically all hired by Red Hat during the earlier days. Not with nefarious intentions, mind you, but simply out of good business practice.


This is anecdotal, but a trend I am hearing from companies in the cloud hosting sector is that because Ubuntu has been a stronger desktop OS (better at "just works" out of the box, particularly on laptops), developers who use it for their desktop then choose to use it instead of RHEL for servers when they are deploying in their enterprise jobs (use what you know). The result is a rapidly growing usage of Ubuntu for enterprise servers. RHEL is still dominant, but the usage of Ubuntu is significant and continuing to grow quickly.

For me: Last I checked, debian had a lot more packages which worked together (Wikipedia says it's currently ~70,000 to Fedora's ~20,000, and RHEL's "3000 + 9000 from EPEL").

Notably, after running Debian on my servers for years, I tried getting the same setup on RHEL to see how it compared, starting with getting nginx running. On Debian it was "apt install nginx". On RHEL, nginx wasn't available from the main repo, so I had to use an unofficial third party. The third-party binary didn't work, so I tried to compile from source. Compiling from source failed because some libraries were missing. Those libraries also weren't in the RHEL repo. The third-party repo had some, but they were out of date. Trying to compile the libraries from source, one of them used SCons rather than Autoconf, so I tried to install that. RHEL didn't even include the basic, popular (at the time) build tools, so I decided to give up rather than compile my entire stack from scratch, bypassing the package manager :P


When Ubuntu came along, it was VASTLY more friendly for laptop and desktop use than the other flavors of its day. Hardware that required me to tweak config files in Red Hat "just worked" with Ubuntu. It was much simpler to install Java for development. Etc.

So I read all of the holy wars about APT vs. RPM, Unity vs. whatever, etc. I can imagine that LTS support is important to a lot of people.

But honestly, for me I just spent 10 years learning Ubuntu on my local machine. So in a situation where a developer like me gets to influence which OS will be used on the servers (a situation that happens surprisingly often!), I'll generally go with what I know. I suspect that this is the origin story for most of Ubuntu's base.


Ubuntu had saner default desktop package sets than most other distros at the time, for a good long while.

You wanted Firefox. Everyone wanted Firefox, pretty much. Obviously. Yet other distros liked to package up Epiphany or some such weirdness when you let them give you the deluxe desktop software installation. Same for lots of other categories of desktop software.

Ubuntu's defaults and the options they pushed to the front were consistently the right ones, for the time. I can only assume politics and oddball personal preferences were behind every other distro getting that so wrong for so long. And yeah it didn't matter that much if you were fine picking your own packages, but Ubuntu tested with their defaults and things worked pretty smoothly with them, plus they were mostly the programs you'd have chosen yourself, anyway. I've never seen another distro where I could just say "gimme the desktop, the whole shebang" and be more or less ready to get going when it was done—all the others it's "no, god no, give me barebones and I'll install what I want".

This is early Ubuntu, mind you. I stopped using it (and, for the most part, desktop linux generally) when they brought in PulseAudio and wrecked audio—previously among the most reliable things on my Linux systems, for almost a decade—and I basically rage-quit Linux.


Yeah, I think your opinion that Red Hat leads the way in the technical foundations of Linux is quite controversial.

From my experience, Ubuntu is significantly more developer friendly and better documented than Red Hat products. Linux is used in a huge variety of ways, and in my experience Ubuntu is better at accommodating a plurality of contributors from different affiliations. Canonical also funds lots of critical Linux infrastructure work - perhaps not as much as Red Hat, I haven't compared. But my point is that I think they fund more useful things, and are better at building what feels more like a community - which yields massive dividends down the line.

This is not to say that Ubuntu is without flaws. Even with PPAs, packaging for Ubuntu/Debian is a massive pain, the recent APT vulnerability showcases shortcomings in their infrastructure, and Canonical thrashes on a lot of technologies that they really shouldn't thrash on. But in the long run their strategy is better at attracting developers, and I don't think you need to worry about them finding relevance.


I've found the opposite is true. Ubuntu loves to make ubuntu specific things. They made Unity, which eventually failed and led them back to GNOME. They made Mir, which eventually failed and led them back to Wayland. They made Upstart, which eventually failed and led them back to systemd. Fedora always chooses what the majority of the community wants to use and then ships the most vanilla version possible.

Fedora chooses what Red Hat develops. And Red Hat are building their own OS, with a lot less of the modularity and customisation that has historically been possible in Linux.

Because, once I got started ~20 years ago (first with FreeBSD, then dabbling in Slackware, then Debian => Linux), re-learning the location of config files is not a priority, and indeed something I would avoid like the plague. Remembering slight syntax changes between distros / CLI switches that work on one but not another / synonyms (does this platform use "remove" or "uninstall"?) is the bane of my life.

Also I have never got on with yum. I still don't. It's a mystery of a program compared to apt or pkg.

Learning RHEL would've been useful (but not essential) in my career, but 20 years ago that wasn't obvious.


I guess that's why I am sticking with Red Hat. I don't really want to maintain two ecosystems' worth of cognitive load, and I have yet to work for a company that didn't highly value RHEL skills, whereas Ubuntu has only been valued at help desks for desktop support.

Also, the Red Hat docs have been great training material, and I think they are worth paying money for.


What you say is true, but this is changing. Red Hat became valuable as a set of skills, because they developed a certification system that is provable. Those are not paper tiger certs. They are earned. While I don't personally have one, most of the people I know who work on CentOS/RHEL for a living are quite capable with other distros as well.

Debian/Ubuntu are used in surprising places, though, as you likely know. Ubuntu and Raspbian on a Pi are great at prototyping stuff. I've used Raspberry Pis for professional projects, and at my last job, a couple of Pis running Pi-hole to blacklist ads, beacons, and trackers for a public library that didn't have a lot of money in their budget for a more "proper" solution.


I don't disagree, although I think the underlying OS is becoming less important. I'm thinking about Kubernetes, which on GCloud runs on top of a modified Ubuntu. I do also see debian being used more in the embedded/low power space. But RHEL and friends seem to win in the high performance market. I also see a weirdly large amount of FreeBSD and BSD derivatives.

I know of at least two FAANGs that base their internal Linux distros off of Debian. If you want an enterprise OS with certifications and a friendly embrace from IBM, why not just use Windows?

> If you want an enterprise OS with certifications and a friendly embrace from IBM, why not just use Windows?

The accidental (?) irony in this is just awesome :)

https://www.redhat.com/en/about/press-releases/ibm-acquire-r...


Investing in Red Hat is typically also investing a lot of money to lock yourself into a larger ecosystem. I know why they do it, and that's because it's easier for support, but requiring you to run RHEL to use OpenShift (maybe that's changed?) is a real problem for a lot of companies. Also, RHEL and CentOS packages tend to be extremely outdated in the name of security. For some of us, those newer features are needed.

What drew me to CentOS was all that comprehensive free documentation produced for RHEL that no other distribution seems to have. I don't think that can be done if you have to consider package updates as in Deb.

RPM-based distros certainly led server OEM shipments back in the day, but that didn't mean Debian wasn't being used extensively in the backoffice.

As others have pointed out, APT played a critical role in Debian's popularity as open source software grew and early naive software distribution methods were outpaced by complexity.

I won't weigh in on Canonical's broader presence across verticals, technologies, etc., but Ubuntu is widely recognized as the most popular Linux in cloud platforms. I believe this is a direct result of APT coupled with a more rapid release pace.

RPM-based distros remain extremely valuable and extremely popular in many scenarios, particularly when someone has to run a product from a proprietary ISV or something that has a very complicated certification model.

In some cases, though certainly not all, and in my opinion a decreasing number, the choice of distro might be tied to how active a particular company is in the upstream kernel. "Buy Red Hat because they contribute to the kernel". Suffice to say, as decision makers realize that their app isn't the kernel, that decision criterion is shifting. (Thankfully, many RPM-based vendors, including Red Hat, have many other value items to add.)


Because Debian is not owned by a corporation. Who knows what IBM will do with Red Hat. We all know that massive corps that buy smaller ones usually don't treat the smaller corps very well. They often buy them for the patent portfolio or tech, merge it in or kill it outright after a time.

While Red Hat has determined (sadly) the overall direction of Linux development with things like systemd (don't get anyone started on this one), PulseAudio, et al., does anyone really think this will not end up tragically?

Independent distros like Debian (who sadly adopted systemd) and Slackware (now the oldest continually developed distro, and thankfully not using systemd) need to exist. Linux hackers, both pro and amateur, need a place to tinker that is not always being broken. Both Debian and Slackware are fairly conservative distros that have a long development time. I for one favor this long development time for things I need to be rock solid.

Look at the development space as well as PaaS, Web hosting, etc. Ubuntu dominates this space. With one exception: people that need CPanel. Ubuntu is generally faster, all things considered, and definitely easier to use, upgrade, and maintain.


But most of the Linux kernel hackers run Fedora (Linus) or work for Red Hat (Dave). Linus specifically uses Fedora because it just works. I'm also not sure what you mean by faster? There are a lot of factors in play in terms of OS efficiency, and benchmarking is hard.

Like I said, I do think that Ubuntu dominates small business, but large businesses still focus on RHEL. Cloud has just made numbers like install counts easier to track. I wouldn't necessarily say that the number of installs even matters compared to what those installs are being used for, and how much clout Canonical has in the Linux community versus Red Hat. Red Hat directly employs a ton of critical people in standard Linux projects like freedesktop, etc.


I work on both Debian and Red Hat systems. I've found Debian to have better fit, finish, and polish, and better documentation as well; for example, Debian continues to ship init.d scripts as a backwards-compatibility mechanism. Apt is also for the most part idiot-proof, and that helps too.

> I'm not really sure who the target audience of debian is.

A part of their core users are admins that don't get paid by the hour.

And since we're being controversial: any good admin should run Debian (or maybe Ubuntu) if they had the choice, if only because you can run the same system on your laptop and on the servers. Watching someone manage their Red Hat/CentOS servers from OSX (or even Windows) is just painful, especially if something is wrong or down (which is usually when you need an admin the most).

> what debians end goal is.

There is no end goal, it's simply a group of people making a distribution of software they use themselves. It's partly a matter of convenience.


Debian is far and away the most stable distro out there, arguably more stable than RHEL. Many people and organizations, including serious research facilities, use Debian for this very reason. It's also free as in freedom and free as in beer. Best of both worlds. Scientists already have the chops to run Linux, and Debian allows easier tinkering and more customizable approaches than RHEL/CentOS does. It's also easier to upgrade and manage. An in-place dist-upgrade is a breeze in Debian. Not so with RHEL/CentOS. I've done several of both, and Debian wins here hands down.
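
For the unfamiliar, an in-place Debian upgrade is roughly this (a sketch from memory; substitute the right release codenames and read the release notes first):

  apt update && apt upgrade    # bring the current release fully up to date
  sed -i 's/stretch/buster/g' /etc/apt/sources.list
  apt update
  apt full-upgrade             # cross over to the new release
  reboot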

> Watching someone manage their Red Hat/CentOS servers from OSX (or even Windows) is just painful, especially if something is wrong or down (which is usually when you need an admin the most).

Why? (Genuine question; I've been doing that for years (from Windows) and never had a particular issue with it, even during crises. Not that I had the choice; if I did, I would probably have installed Linux.)


The subtle differences will bite you at the worst possible time. Things like a rarely-used command flag doing something different.

I'm more of a developer than a sysadmin, but whenever I've had to use a different distro/OS on my development machine from what the servers were running, eventually I've hit a bug that only occurred on the servers and couldn't be reproduced on my development machine. And sure it's never impossible to diagnose and fix those bugs, but it's a bunch of extra effort that you just don't have to make if you're running the same distro everywhere.


Oh I see, thanks for your answer. It's mostly a developer issue for you then, and it's your sysadmin's job to provide you with the right tools; but he probably won't take "just install Debian everywhere" as a valid suggestion. A sysadmin just needs putty, and then he'll build his infrastructure to make his life easier. He will never write or run code on his own machine, but on a machine similar to the production target. And he will rarely write code himself to begin with.

You can ask your sysadmin for a FaaS (such as JupyterHub, or anything that you like) to be deployed on machines that are iso-prod. You can also ask him to provide a proper CI/CD toolchain (start small with the minimum: git, to build, to test deployment, to unit tests). And of course matching your dev machine to production will be beneficial if you can't get a FaaS, but your native OS may still be Windows or macOS; you'll simply use virtualization or Docker to make it compatible.

As a sysadmin I face this problem a lot here. I work with Python and R devs, and we are providing them with server-side IDEs (RStudio and JupyterHub); otherwise they will have something running on their Windows laptops (provided by external teams; we have no control over that, nor the right to change it) and expect it to run the same in production.

But as a sysadmin I don't face the problem myself, because I don't develop much, that's why I didn't get the initial comment.

Edit: just to clarify a point. Just a few months ago, when I arrived, it was indeed really painful to see people write Python or R code on their Windows machines and not be able to run their code on the server. Some of them even asked to have Windows servers to get rid of that issue. That's the type of miscommunication I'm talking about, because on the other side of the email, someone gave them the Windows servers, not understanding/caring about their actual need.

It was obviously the wrong solution but, in desperation and with the incoming deadlines, it became the right solution. So the problem isn't CentOS vs Debian vs your laptop, it's really about consistency in the toolchain and pipelines. The exact tool doesn't even matter (as long as it works).


VMs, remote IDEs, or FaaS are ways to develop in a server-like environment, but they all come with their own overheads; IME none has as good a debugging experience as being able to run the thing directly on your normal dev machine. (Even for sysadminly scripting tasks, working via a putty window carries overhead, and there are times when you'd rather be on the machine itself, though I'd agree that it's less common for it to be an issue there than for development work.)

You're right that the issue is consistency rather than the particular OS. But there is a real point here that Ubuntu in particular is functional as a first-class OS both on developer laptops and on production servers, in a way that none of the other options can quite match. (Windows' administration support/automation is lacking on the server, OSX servers barely exist, CentOS is outdated on the laptop, Debian is too principled to support laptop hardware, Fedora is undersupported on the server...). So it does make for a unique selling point for Ubuntu in practice.


Apart from having to hunt down things like "putty" off the Internet (why am I installing a binary directly from the web that I'm going to type passwords into?),

and not even touching on exotic stuff like suddenly being in dire need of a serial terminal ("Wait, GNU Screen does that? I already have that installed!"),

it can range from simple stuff like having "man mdadm" available right there on your laptop because the wifi is spotty, to searching for stray/mysterious files by installing the same software on your laptop and doing a locate (because locate might not be installed on the server, or running "updatedb" might be inappropriate there at that time).

It's a thousand little things like that that make me feel like a fast typist/programmer watching someone hunt-and-peck through entry forms when I watch someone administer Unix systems from Windows (and OSX, but to a lesser degree).


I'm going to play devil's advocate here, but when you download Debian, it's mostly a binary from the web that you're going to type passwords into.

But I totally get your other points. See my other answer above for a partial solution. I think the main issue overall is the lack of dialogue between sysadmins and devs. I'm a sysadmin in a dev team and we don't have that issue together, because we are deploying the right tools and making things easier for everyone. But whenever we talk to the "other" sysadmins, I see a lot of frustration on both sides, because they simply don't speak the same language.

Those "others" sysadmins provide the whole underlying infrastructure, and it's "latest CentOS" and nothing else. They don't have time to explore, maintain and secure other options. They don't have time to be more specific about the tooling they can provide. They just give cpu/ram/storage in the form of a VM that you order from a ticketing system or a self-service portal. And then they spend their day rebooting unresponsive VMs and opening network incidents. So they couldn't care less about your python module that's not the exact same version as your laptop. (Heck, they don't even provide R or python, we have to build it and package it ourselves and include dependancies).

So they get frustrated, and the devs get frustrated, and nobody can ever work together properly. Because the right strategy is not in place. But this has nothing to do with your laptop being this or that OS or the servers running CentOS instead of Debian. It's about your pipeline from your POC to your production code, and how people before you have tackled the issue (hint : most of the time they didn't).


> any good admin should run Debian (or maybe Ubuntu) if they had the choice, if only because you can run the same system on your laptop and on the servers

Why is Debian special in that respect? I once worked at a place where we ran CentOS on our workstations, and the corresponding RHEL on servers, for exactly this reason.


I think Debian is so good in its multiple roles because it's used in all these roles by its various maintainers. I don't think there has ever been a Debian "server edition" or a "desktop edition", for example.

I've always felt RH/CentOS were a little less flexible when it came to desktop/laptop setups.


Watching someone try and plug their linux laptop into a projector is more painful.

I did a Debian-to-CentOS switch for an HPC cluster last year. I have to say the docs for CentOS, produced by Red Hat for RHEL, are great and a big factor that you don't mention.

> In actual business dealings Red Hat has always led the way in enterprise

Not going to rephrase what others have said, but I wanted to add a funny anecdote: a long time ago, when I worked for telcos, many customers required a Red Hat professional license for any piece of infrastructure running Linux. We therefore used to license one Red Hat server for each deployed server running Debian :-p


I've never heard your description of Red Hat before, nor why it would be a good reason to use them.

10-15 years ago I tried Red Hat / Fedora and it wasn't great. I then also tried Debian, and it just worked and was easy to use.

Now, I'm not looking for support much. I prefer the base stability of Debian/Ubuntu. I haven't looked into Red Hat in a long time, and I also don't see any big reason to.


Fedora has come a long way in the past 10-15 years. It's a very stable GNOME workstation distro that tends to use bleeding-edge stuff. They also have some neat ideas (like Silverblue) that are fun to play around with.

I've also found dnf/yum to be overall nicer to use than apt, the history feature is especially useful too.
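
Concretely (transaction IDs come from the list output; 42 is just an example):

  dnf history list       # numbered log of past transactions
  dnf history info 42    # exactly what transaction 42 changed
  dnf history undo 42    # roll that transaction back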


> I've never understood the appeal of Debian or Ubuntu, or I guess I don't understand why people would invest so much time in them compared to Fedora and Red Hat.

For historical reasons. Debian was around before Fedora, CentOS, and Red Hat Enterprise Linux (about the last one I'm not 100% sure). I also remember growing up thinking Red Hat had a more enterprise-focused community, whilst Debian was more for the masses. Wrong perception? Perhaps, but a perception nonetheless. Practically everyone I know that used Linux back then did it also for ideological reasons tied to the free software movement. Red Hat was a company; Debian was a movement. It had a social contract: https://www.debian.org/social_contract.1.0 which aligned with most of the free software community's ideals back then. This was a time when not licensing your project under the GPL and instead choosing a BSD/MIT-style licence was frowned upon, because "something something Microsoft BSD TCP/IP stack, you see?".

I imagine everyone has their reasons, mostly subjective. The first time I got into GNU/Linux was circa 1999, with Red Hat 6 (pre-Fedora days), and Debian was not yet on my radar. Fast-forward a year: I installed Debian (I think it was "potato") on my machine and fell in love almost instantly, thanks to apt, their ethos, and just the way the entire distro was designed (relatively easy switching from stable, to testing, to unstable...). Mind you, I was a very novice user and not that tech-savvy, but I was (I still am) a "believer" in the free software movement. I only had a 256kbps internet connection and used to download packages sporadically. The Debian CDs came with quite a good number of packages, and I remember magazines that distributed CDs with entire repositories (you could add your CD as a repository source).

Fast-forward a few years, and I had left Debian for Slackware, then FreeBSD, then Gentoo (they lured me with the "portage is like ports" hype), and then Arch Linux. I always kept my respect and admiration for the good experience I had with Debian as a user. I never felt that for Red Hat. What always took me from one distro to the next was tooling. I'd say tooling is the number one aspect of user experience in Linux distributions for a certain type of user, one which tends to be less tech-savvy than the rest. People want to install packages that auto-configure, and then tweak if necessary. Debian gave you that (sort of). You even had unattended upgrades. Back in the early 2000s that was a real "OMG" feeling for some, myself included.

> I've also never understood the appeal of .deb or apt.

The same reason Homebrew is such a success on macOS, or Chocolatey on Windows.


RHEL isn't worth the time, as it is broken by design. It only includes the minimum packages for the OS, relying on 3rd-party repos for everything else, and those 3rd-party repos are generally terrible. The model is that they supply the core OS and you supply everything else, but I've never seen that work in practice.

Fedora might be OK; I've never used it, and I've never known anyone who used it either. It always seemed more fringe than Debian and all its various child distros.


> Since Debian developers are famously an agreeable and non-argumentative bunch, there should be no problem with that aspect of things.

Is this sarcasm? I don't read LWN enough to know if they are dryly joking, and I don't read the Debian listservs to know if it's true or not.


Before, when Debian Wheezy was the current stable version, I would have read this and assumed no sarcasm and agreed wholeheartedly. But these days, I'm with you, I just can't tell.

Given the recent article by Michael Stapelberg[1] that made it to the front page of HN the other day, and now this, it seems like things are a bit rocky at Debian these days.

A shame too, because even though I don't use it that much anymore, I still consider it one of, if not the, most important Linux distributions available. Hopefully this is just a glitch that will get sorted sooner rather than later!

[1] https://michael.stapelberg.ch/posts/2019-03-10-debian-windin...


I actually think things are a lot less rocky these days - nothing in Michael Stapelberg's post is new, it's just a difference of scale and perhaps the developer population getting a bit older and more committed to their day jobs. There are a lot of things helping with scale: Alioth (running a fork of SourceForge's code) was replaced with Salsa (running GitLab), which effectively collapsed the options for version control from CVS, SVN, Git, Darcs, etc. to just Git. Debhelper 9 standardized a ton of workflows. Team maintenance is a lot more common than individual maintenance, and uploading packages owned by unresponsive maintainers has gotten a lot more socially acceptable. But a lot of the old problems have scaled more quickly than solutions: if you're doing an archive-wide change, there are more packages you need to touch, more people you need to contact, etc., so each old problem happens more.

(I also don't think any of the problems in that post are about developers being disagreeable or argumentative - just about Debian having a lot of technical debt and no easy way to pay it down.)


Fair enough, and based on what I read yesterday, your comments about problems scaling faster than solutions make a lot of sense. You seem to be more on top of it than me.

As someone who no longer uses it but has a lot of history using Debian I hope you are correct and that this latest issue of not finding a leader according to the standard schedule is just something they route around. Based on the tone of the end of today's article it sounds like that's the case.


Yes, it is sarcasm.

I know how loved Debian has been by the community and all that has been built on it, but as a user I kind of always felt that Debian was doing its own thing, at its own pace, with its own requirements. Maybe this is a sign that that way of running a Linux community is over? We really have a fairly robust ecosystem now, and development has become much more refined. Or I might be missing the inner politics of the community that caused this?

Can you expand on your comment a bit? Maybe I've got rose-colored glasses, but I felt Linux had more of a community feel ~10 years ago, when Ubuntu was ascendant but still had that scrappy feel, lots of Gentoo users were funrolling loops, and Red Hat wasn't owned by IBM.

There's also the problem (IMHO) of OSX having pulled otherwise Linux-first developers away. I feel an especially heavy effect on desktop efforts like Wayland and systemd.

Some time since then, things went a bit "weird".


Maybe this is an effect that the Linux ecosystem fragmentation of recent years has had on communities.

I would say the opposite. Things have been consolidating, in companies and in technologies. It used to be that you had 10 options for everything. Now things are getting more and more standardized. I like to think we are much better off in Linux today than we were 10 years ago. You used to pick out what applications were important and see what version was supported by what distro. That isn't a thing anymore.

I think the linux community is now less about the distro, and more about what runs on the distro.

The latest trends in software development (containers, Kubernetes) are just moving too fast for Debian to keep up, IMO. I think we're going to see the rise of 'purpose-built Kubernetes' distros.


I would hope an Amazon, Google, DigitalOcean, Rackspace, or other major VPS player could get someone involved for an OSS project critical to their business.

I guess as a user I frequently take this for granted.


There isn't (to my knowledge) a large enterprise-supported version of Debian like Red Hat / CentOS. You could argue Ubuntu, but Ubuntu is significantly different in focus from Debian, while Red Hat is almost identical to CentOS.

Debian also, unjustly, suffers from its non-corporate image/reputation.

Most of my professional sysadmin career has been spent managing RPM-based Linux distributions. The time I was asked to netboot and automatically configure multiple Ubuntu/Debian/Mint workstations, I found that there just wasn't the breadth of netboot/kickstart-type documentation available on Debian based systems. The smaller number of users doing that sort of thing led to less documentation. (I'm not saying that is definitively the case nowadays, it's just that lots of large scale deployments mean a lot of documentation and projects to support such activities)


The relationship between Debian and Ubuntu is more akin to that between Fedora and RHEL.

Having a different focus from Fedora -- in fact, one could say an almost radically different one[1] -- hasn't stopped Red Hat from contributing people, hardware, and support to Fedora, leaving the volunteers to focus on contributions, be it code or, as you point out, documentation.

Help with 'administrative' or grunt-work tasks goes a long way in large OSS projects, and Ubuntu/Canonical simply haven't stepped up where they should have.

[1] for business vs. for developers, cutting edge vs. stability, etc.


Red Hat have been pretty smart with the whole Fedora/RHEL/CentOS progression. You get to mess with cutting-edge stuff in Fedora (and Fedora Silverblue), and then take what works and fold it into RHEL. Then you rebuild RHEL into CentOS, so people use the free and binary-compatible version of your product, which leads them to use the supported/sold version in production.

Why not Canonical/Ubuntu? After all, they are presumably the ones most affected by the state of Debian. Just a couple of days ago I made a comment along these lines in a different thread:

https://news.ycombinator.com/item?id=19353816

  > While this is sad and painful to read, I can't say I'm surprised.
  > 
  > The problems listed are precisely the kind of problems that Red Hat
  > strategically supports Fedora with, in terms of investment of resources.
  > For all the hate Red Hat receives, it has consistently been a good
  > community member by being willing to help Fedora in areas that it knows
  > are hard and yet not 'cool' enough to attract volunteer contributions.
  > 
  > What has Ubuntu done for the Debian community along the same lines?
To further contrast this difference, here's what Red Hat does for Fedora and OSS in general:

https://fedoraproject.org/wiki/Red_Hat_contributions

https://getfedora.org/sponsors

versus:

https://db.debian.org/machines.cgi

https://www.debian.org/partners/

https://www.debian.org/mirror/sponsors

Of course, that doesn't stop 'em from throwing in lip-service:

https://www.ubuntu.com/community/debian

https://www.canonical.com/projects


I would hope someone could set me straight if I'm wrong, but I thought that the relationship between Canonical and Debian was mostly one of taking rather than giving back?

You're right about Red Hat though, they get a lot of negative feeling from people who may not understand just how much they give back to the community at large.


I think it depends a lot on the relationship between the people who work on some part of the distribution. The Canonical people are paid and have their todo list, and they are paid to do that. So if they have good contact with Debian, they try to get things done via Debian. Some don't care at all. Some are even Debian Developers who do all their work in Ubuntu and just upload something to Debian at some point. Or they try to force their init system into Debian.... So there are a lot of positive things, but when I think about the biggest issues we had (or actually have) in Debian, I consider Ubuntu evil.

Amazon has Amazon Linux, which is based on CentOS, and Google has invested in CoreOS, which is also based on CentOS, just for K8s. I don't know if other vendors like DO would invest in any specific distro, tbh.

Additional data point: the Linux that runs Google's corporate machines is a Debian derivative:

https://en.wikipedia.org/wiki/GLinux


CoreOS was based on ChromeOS/Gentoo. With the Red Hat acquisition of CoreOS, it will be merged with Atomic and based on Fedora/RHEL.

I think GKE's 'cos' is based on the ChromeOS/Gentoo version of CoreOS.


I'd hope not. An organization like the Wikimedia Foundation or Apache would be a better fit.

I agree... Debian's corporate independence is a feature not a bug.

I can't see why they would; they don't use it internally (afaik). Even Amazon has its own (CentOS-derived) version of Linux for instances.

> The good news is that this possibility, too, has been foreseen in the constitution. In the absence of a project leader, the chair of the technical committee and the project secretary are empowered to make decisions — as long as they are able to agree on what those decisions should be. Since Debian developers are famously an agreeable and non-argumentative bunch, there should be no problem with that aspect of things.

Even if nobody puts their name forward this week, or the next, or the week after, and even with the sarcastic remark in the last sentence, it looks like this is not such a big deal after all.

> In other words, the project will manage to muddle along for a while without a leader, though various aspects of business could slow down and become more awkward if the current candidate drought persists.

Why should it muddle along or slow down? All the responsibilities listed (and linked to in a mail in the mailing list) seem to be things that others could take up, either individually or as smaller groups. It's something that could be worked out over a period of time.

> One might well wonder, though, why there seems to be nobody who wants to take the helm of this project for a year. The fact that it is an unpaid position requiring a lot of time and travel might have something to do with it.

The last sentence is probably the issue. Couldn't sponsorships help? Then again, the last thing we need is for Debian to become corporatized.

Donate to Debian at https://www.debian.org/donations


> Couldn't sponsorships help?

As implied and cited by the article itself, money and Debian are uneasy bedfellows.

> The last thing we need is Debian to become corporatized.

I highly doubt this will be a likely outcome here.


I am switching to Devuan, which is Debian without the service manager known as systemd (not trying to flame about systemd, but it is no longer my preference).

So far it has worked great for me on my test laptop, and I will be rolling out some VMs shortly.


I run KVM VMs on devuan and could not be happier. (flame backspaced out, you're welcome...)

I've just browsed https://en.m.wikipedia.org/wiki/List_of_Debian_project_leade...

There are quite a few names I've never heard of, and some that I've heard of, but not as DPL.

The last six (after Hocevar) are total blanks for me.

Have my interests changed so much, or did the recent DPLs keep a much lower profile?


Chris Lamb is a remarkably active and outgoing DPL. But yes, I've been active in Debian for well over a decade and I've seen both a lot of people leave and new people that I don't know or get to know as I used to. All 5 DPL candidates ring a bell for me, though.

What would be the case today, anno 2019, for using Debian over CentOS, Ubuntu, or one of the other distros?

Debian is all over the place in embedded. It's pretty much the de facto distribution for consumer boards (you see Yocto, Arch, Ubuntu Server, etc. too). I'm not sure what the distribution is in private industry, but manufacturers typically provide Debian images at a minimum. This matters for embedded systems because you're relying on a company to provide hardware-specific patches. While those are typically kernel patches (and so somewhat distro-agnostic), most people don't want the hassle of compiling their own kernel. That means you're likely going to be using Debian.

From my experience in private industry, almost everybody uses Yocto.

Security and stability, independence from financial interests of an owner.

Arguably rpm/yum is still a kludge compared to dpkg/apt.

It's not, although I'll admit that it's painful to learn. I spent my fair share of time learning it, and can now understand what's wrong with my configs in a split second when something doesn't work. Overall it's faster and easier to search with than apt, from my (very small, I admit) experience with apt. The dependency system works really well, even when messing around with upgrades/downgrades/uninstalls too.

And I admit that a few years ago I would rank apt as much better than yum. Matter of preference / habit I guess.


I agree about Yum as of RHEL7, but dnf/yum4 are actually quite good in my experience.

Could you expand on this or share pointers to any articles/posts on this?

I wonder how this will affect Raspbian as the default Raspberry Pi distro...

I love Raspbian on my Raspberries, but I've also had success with Arch. It would be a pain to have to switch away from Debian, but at least there's a "Plan B". As for more inexperienced Raspberry Pi users, there are other OSes, but undoubtedly without the wealth of libraries etc. that Raspbian has.

I wish it would become obsolete, as it's an abomination.

How is it an abomination?

It seems to have served its purpose well so far. It is relatively easy to use and hasn't gotten in the way of anything I've wanted to use it for yet.


I'm not the parent, but the RPi is the only ARMv6 target of any significance; supporting an outdated architecture has non-trivial costs, which is why Ubuntu and other distros dropped support.

It ships a custom kernel and promotes doing things in a different manner than we are used to with Linux: config.txt in the boot partition, overlays, RPi.GPIO (even a different GPIO numbering scheme), raspicam, and so on. Some of these are meant to make things easier for newcomers, but IMHO it makes things much worse, as you end up with a "special" RPi environment instead of a standard Linux-on-ARM-board environment.

You can run mainline on the RPi just fine, with standard tools; the only problem is that most RPi projects on the internet won't work, as they target Raspbian instead of generic Linux.
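
For example, where a generic ARM distro would describe hardware through its standard device tree, on Raspbian you toggle peripherals in /boot/config.txt (these two directives are real; exact names vary by peripheral):

  # /boot/config.txt
  dtparam=i2c_arm=on    # turn on the I2C bus
  dtoverlay=w1-gpio     # enable 1-Wire on a GPIO pin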


This could be a great opportunity for someone looking to step into a leadership role.

It is not: https://danielpocock.com/what-does-democracy-mean-in-free-so...

The current Debian "developers" have uniformly agreed that:

1) None of them will participate in the election

2) People outside of the "developer" circle (including major contributors, who do most of the actual work on Debian) can't participate in the election

The whole "crisis" is artificially created. Not like that matters, — most of decision making is already done by corporate employees, not some fictitious "community leader".


Reading through those articles, it indeed seems like a political nightmare. When you have both volunteers and paid members, politics will be inevitable, as those getting paid will do everything in their power to keep it, while those not getting paid will do everything they can to gain it. Just like in a startup, money can kill relations. Money is really the root of all evil. In any organization there will be politics, so as a leader you have to learn how to deal with it. You do not necessarily need any power, as you do not lead by giving orders; you lead by showing. I think solving the inequity problem would be a top priority: maybe dedicate 50% of your own FAANG salary and give it to people who do the same job as you do but don't get paid. It's probably a really bad idea, but it's just an example; the point is that others will follow.

Has the number of candidates been decreasing steadily, or has this happened out of the blue? Either way, what has happened? Are there more responsibilities attached to this position, making it harder to do now than it was years ago? Is interest in Debian dwindling?

> Is interest in Debian dwindling?

On the contrary, the project is growing in terms of contributors, packages and internal projects.

https://reproducible-builds.org/ mostly came out of Debian (see https://wiki.debian.org/ReproducibleBuilds )

Despite the name "leader", the role of DPL is close to an ambassador with no power to tell other people what to. The DPL appoints a handful of roles, then hand-on work is delegated and done by other people.

The drama on blogs and mailing lists is in no way representative of how most DDs feel about the project.



