I've never understood the appeal of Debian or Ubuntu, or I guess I don't understand why people would invest so much time in them compared to Fedora and Red Hat. In actual business dealings Red Hat has always led the way in enterprise (whole data centers), kernel development (a bunch of the maintainers work for them, and they lead efforts like cgroups v2), and other highly technical pieces of the ecosystem, including acquiring all of the star developers (such as the systemd folks). Investing in Red Hat is an investment in a deeper understanding of Linux through their excellent documentation, and in well-paying positions through their famously good certifications.
There is of course a lot of Debian and Ubuntu out there in small deployments (usually individuals, small businesses, and education), but those are ultimately less lucrative and more oriented towards beginners/hobbyists. At least Ubuntu used to be. I'm not really sure who the target audience of Debian is.
I've also never understood the appeal of .deb or apt, or what Debian's end goal is. They never seem to be doing anything that interesting except grinding away at repackaging everything. Canonical is just flailing to find some sort of niche, and historically (Ubuntu phone, Mir, OpenStack, Upstart, probably snap) they are incredibly bad at finding relevance. I think having multiple packaging ecosystems is the stupidest idea in the world, and I don't see the value add except giving people the illusion of more choices.
About 10 years ago I was responsible for some servers running Fedora and some servers running Debian. Fedora's update cycle was way too fast, but we also knew that switching to CentOS would get us software that was way too behind to be useful, and we'd be responsible for building a lot of stuff in /usr/local ourselves.
A small technical difference is that the dpkg ecosystem has a feature called "diversions" that let you cleanly say that you want to replace a packaged file. A config management system can use this to avoid having config management and the package manager fight each other. I'm the current maintainer of https://github.com/sipb/config-package-dev , a tool that lets you build Debian packages that express system configuration, and I've ended up using it at 3 of my last 4 jobs. For the Fedora servers, we were using some mix of checking out /etc from Subversion and rebuilding packages on our own. We surveyed our options and determined that we weren't really better off with any of the "real" config management options at the time (to be fair, modern config management hadn't really taken off then) and the lack of diversions meant we were better off rebuilding packages that we wanted to make changes to.
In diversions, a package claims a file. When anyone else tries to overwrite the same file, it's diverted elsewhere. It isn't designed for the user to choose between equals.
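For the curious, a rough sketch of what a diversion looks like from the command line (the package and file names here are made up for illustration):

```shell
# Divert the packaged config aside so our own copy survives upgrades.
# Any package that later ships /etc/foo/foo.conf will have its copy
# installed as /etc/foo/foo.conf.distrib instead.
dpkg-divert --package my-site-config --rename \
    --divert /etc/foo/foo.conf.distrib /etc/foo/foo.conf

# Our config package can now own /etc/foo/foo.conf without fighting
# the upstream package on every update.

# Undo it later:
dpkg-divert --package my-site-config --rename --remove /etc/foo/foo.conf
```

Tools like config-package-dev wrap this mechanism up so the diversion is declared in the package itself rather than run by hand.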
I built a derivative distro using diversions mainly to avoid "soft-forking" Debian packages as much as possible (maintaining packages is extremely costly for anyone that isn't a decentralized organization like Debian).
I have no incentive to ever install anything other than Ubuntu on my laptop, given the investment I've made in getting used to it. And on servers, I use Ubuntu just because I know it.
By far the best CLI experience with zypper, and the fact that they make it super easy to build and install packages with their own build service and repos is just best in class. It's all done online and can be one click; it makes me wonder why the community still avoids them.
The idiocies of youth. Now I kinda wish I'd made that commitment. Must check it out again.
If anything, Red Hat should have adopted apt.
In terms of sheer number of dumb competing initiatives, Ubuntu > Red Hat > Debian.
There's a very complex interaction between the various files which define a package. It's extremely hard to find out what debhelper and its associated commands are actually doing and to debug them, particularly if you're trying to split a package into multiple parts. I gave up several times trying to make my package before finally succeeding with a simplified version. I'm not sure I will understand how it works when I go back to it.
On top of that, the sheer volume of rules and regulations, different packaging helpers, and out-of-date suggestions just added to the problem.
Deb has far too much magic. RPM is much clearer. I can define a simple configuration file which takes the source, patches it, builds it, and describes the packaged files. I've never had a problem understanding an RPM source package.
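For illustration, a stripped-down spec file really is just a few declarative sections; this is a toy sketch (names and files made up), not a real package:

```spec
Name:           hello
Version:        1.0
Release:        1%{?dist}
Summary:        Toy example package
License:        MIT
Source0:        hello-1.0.tar.gz
Patch0:         fix-makefile.patch

%prep
%setup -q
%patch0 -p1

%build
make %{?_smp_mflags}

%install
make install DESTDIR=%{buildroot}

%files
/usr/bin/hello
```

Source, patch, build, and file list, all in one readable file.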
As a sysadmin, of one box or 100k boxen, I'd have a hell of a lot less apprehension about installing a DEB than an RPM. Thanks largely to Debian Policy, which governs how packages will behave.
The fact that Debian has an order of magnitude more in-repo, in-distro packages also counts for something.
As well as the fact that I can start with a minimal installation and add to it only what the box actually needs, a process Red Hat and RPM have never adequately supported.
Source: Have run both, personally and professionally, for nearing a quarter century.
Nowadays the competitive edge of APT may have diminished, but I never looked elsewhere because I have been quite happy with APT.
Whenever I needed a package from an external repository, apt almost always fell down into a pit of duplicate and incompatible dependencies. Maybe it's better now, but I was burned long ago and never looked back.
Portage is cute but it's tied to Gentoo and few people want to compile their own packages.
Apt was one of the earliest package managers for Linux, for a popular distribution. That was enough to give it a head start regarding packages available. And that's all you need, unless the other solutions are obviously superior from the user perspective, which they're not.
Strongly agree. Gentoo's Portage is the best experience I've had with a package manager on Linux.
My package management observations, as someone who's been using them for... a while, now.
Best experiences: Portage on Gentoo (lots of options; many packages; fast; reliable unless I broke it doing something stupid; maintaining a high level of control over the system at a low level is the only way I've ever managed to make Linux behave itself), Homebrew on MacOS (great package selection; casks are wonderful; outstanding CLI; keeping user packages away from the core system has really grown on me)
Middling: apt on Ubuntu and Debian (OK but increasingly feels dirty after getting used to Homebrew and keeping my user packages separate from the system; breaks "on its own" more often than Portage did, though still rarely; package selection just OK)
Bad: Yum/RPM on Red Hat, Fedora, and long ago, RPM on Mandrake (sloooooooowwwwwwwww and RPMs seemed to break constantly back when I used them), Macports on MacOS (slow; broke so bad the easiest thing to do was to delete the whole dir and start over about once a quarter; limited package selection)
Incidentally, Gentoo's OpenRC is the only Linux init system I've ever used that I liked and felt like I actually understood well enough to use it without apprehension or constantly needing to check documentation.
I like the idea of Nix and tried NixOS once but after an hour or so of following a guide and troubleshooting couldn't get X running so gave it up as "for those with more free time than me—try again in a few years".
I try about 1-2 other distros every year (in VMs or a spare laptop), and I always stay with Gentoo. I won't go so far as to say that it's objectively better, but after 10 years or so just nothing else compares for me.
Using RH easily led you into an "rpm hell" of packages from various third-party repos that conflicted: https://en.wikipedia.org/wiki/Dependency_hell#Platform-speci...
Now they're both comparable. Yum/dnf are OK, and the RPM repos are big. And apt actually has "apt", the tool, which consolidates those separate tools you mention. Before that it had aptitude (since at least 2006 or so, I'd say), but it wasn't used much; now apt is installed by default and you can use it. No more "apt-get", "apt-cache", etc.
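Roughly, the new front end folds the old per-task tools into one command (a sketch of the common equivalents; the old tools still work):

```shell
apt update             # was: apt-get update
apt upgrade            # was: apt-get upgrade
apt install nginx      # was: apt-get install nginx
apt search nginx       # was: apt-cache search nginx
apt show nginx         # was: apt-cache show nginx
apt list --upgradable  # newer convenience with no single old equivalent
```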
So today, nothing makes one of them substantially better, IMO.
As a consequence, Debian packaging policy is (in my experience) a lot more aggressive about using detailed dependency-resolution metadata to establish constraints and define compatibility, which makes for a more annoying developer experience but a nicer end-user experience.
I started using it before 2005, before backports was a thing, and it was a lifesaver for my desktops, as I usually wanted to use newer versions from testing or unstable.
For apt as a format probably not so much anymore. For the Debian apt repositories it's the several thousand capable and driven maintainers.
A) of software - DFSG
B) of decisionmaking - Debian constitution
C) of choice
* both installed program and source is just an apt command away (A, C)
* has a large repository of software (historically largest; nowadays it's still among the largest) (A, C)
* security support for stable release for the whole archive (cf. Ubuntu, RHEL) (A)
* software from repository aims to play nice together (C)
* true community distro, independent from any single vendor (B)
* multiple kernels, multiple desktop environments, multiple init systems, a truckload of architectures (C)
My fear though, is Debian is not going to be able to keep up with the pace of commercial linux distributions. Things are moving at light speed these days.
A good example is initramfs-tools, which Debian created. Before that, (most? all?) distros just had a bunch of shell scripts mostly concatenated together, but hard-coding that for every possible configuration Debian allows was painful for the maintainers, so they created initramfs-tools. They did not have a bunch of consultants who could sell handcrafted solutions to customers.
Red Hat decided many years later to create a similar solution, but for "reasons" did not want to port initramfs-tools and created dracut. Debian is now considering porting to dracut, since maintaining initramfs-tools is a lot of work which could be spent elsewhere. The DebConf BoF on that was basically the maintainer asking "OK, anyone interested in continuing to maintain initramfs-tools? How did the test port of dracut go?".
The incentives and politics in a community project are very different from those within a company. Not always better for everybody, but very distinct.
I would also guess that the small number of developers who work in such a niche area were basically all hired by Red Hat during the earlier days. Not with nefarious intentions, mind you, but simply out of good business practice.
Notably, after running Debian on my servers for years, I tried getting the same setup on RHEL to see how it compared, starting with getting nginx running. On Debian it was "apt install nginx". On RHEL, nginx wasn't available from the main repo, so I had to use an unofficial third party. The third-party binary didn't work, so I tried to compile from source. Compiling from source failed because some libraries were missing. Those libraries also weren't in the RHEL repo. The third-party repo had some, but they were out of date. Trying to compile the libraries from source, one of them used SCons rather than Autoconf, so I tried to install that. RHEL didn't even include the basic, popular (at the time) build tools, so I decided to give up rather than compile my entire stack from scratch, bypassing the package manager :P
So I read all of the holy wars about APT vs. RPM, Unity vs. whatever, etc. I can imagine that LTS support is important to a lot of people.
But honestly, for me I just spent 10 years learning Ubuntu on my local machine. So in a situation where a developer like me gets to influence which OS will be used on the servers (a situation that happens surprisingly often!), I'll generally go with what I know. I suspect that this is the origin story for most of Ubuntu's base.
You wanted Firefox. Everyone wanted Firefox, pretty much. Obviously. Yet other distros liked to package up Epiphany or some such weirdness when you let them give you the deluxe desktop software installation. Same for lots of other categories of desktop software.
Ubuntu's defaults and the options they pushed to the front were consistently the right ones, for the time. I can only assume politics and oddball personal preferences were behind every other distro getting that so wrong for so long. And yeah it didn't matter that much if you were fine picking your own packages, but Ubuntu tested with their defaults and things worked pretty smoothly with them, plus they were mostly the programs you'd have chosen yourself, anyway. I've never seen another distro where I could just say "gimme the desktop, the whole shebang" and be more or less ready to get going when it was done—all the others it's "no, god no, give me barebones and I'll install what I want".
This is early Ubuntu, mind you. I stopped using it (and, for the most part, desktop linux generally) when they brought in PulseAudio and wrecked audio—previously among the most reliable things on my Linux systems, for almost a decade—and I basically rage-quit Linux.
From my experience, Ubuntu is significantly more developer friendly and better documented than Red Hat products. Linux is used in a huge variety of ways, and in my experience Ubuntu is better at accommodating a plurality of contributors from different affiliations. Canonical also funds lots of critical Linux infrastructure work - perhaps not as much as Red Hat, I haven't compared. But my point is that I think they fund more useful things, and are better at building what feels more like a community - which yields massive dividends down the line.
This is not to say that Ubuntu is without flaws. Even with PPAs, packaging for Ubuntu/Debian is a massive pain, the recent APT vulnerability showcases shortcomings in their infrastructure, and Canonical thrashes on a lot of technologies that they really shouldn't thrash on. But in the long run their strategy is better at attracting developers, and I don't think you need to worry about them finding relevance.
Also I have never got on with yum. I still don't. It's a mystery of a program compared to apt or pkg.
Learning RHEL would've been useful (but not essential) in my career, but 20 years ago that wasn't obvious.
Also, the Red Hat docs have been great training material, and I think they are worth paying money for.
Debian/Ubuntu are used in surprising places, though, as you likely know. Ubuntu and Raspbian on a Pi are great at prototyping stuff. I've used Raspberry Pis for professional projects, and at my last job, a couple of Pis running Pi-hole to blacklist ads, beacons, and trackers for a public library that didn't have a lot of money in their budget for a more "proper" solution.
The accidental (?) irony in this is just awesome :)
As others have pointed out, APT played a critical role in Debian's popularity as open source software grew and early naive software distribution methods were outpaced by complexity.
I won't weigh in on Canonical's broader presence across verticals, technologies, etc., but Ubuntu is widely recognized as the most popular Linux in cloud platforms. I believe this is a direct result of APT coupled with a more rapid release pace.
RPM-based distros remain extremely valuable and extremely popular in many scenarios, particularly when someone has to run a product from a proprietary ISV or something that has a very complicated certification model.
In some cases, though certainly not all, and in my opinion a decreasing number, the choice of distro might be tied to how active a particular company is in the upstream kernel: "buy Red Hat because they contribute to the kernel". Suffice to say, as decision makers realize that their app isn't the kernel, that criterion is losing weight (thankfully, many RPM-based vendors, including Red Hat, have plenty of other value to add).
While Red Hat has (sadly) determined the overall direction of Linux development with things like systemd (don't get anyone started on this one), PulseAudio, et al., does anyone really think this won't end up tragically?
Independent distros like Debian (who sadly adopted systemd) and Slackware (now the oldest continually developed distro, and thankfully not using systemd) need to exist. Linux hackers, both pro and amateur, need a place to tinker that is not always being broken. Both Debian and Slackware are fairly conservative distros that have a long development time. I for one favor this long development time for things I need to be rock solid.
Look at the development space as well as PaaS, Web hosting, etc. Ubuntu dominates this space. With one exception: people that need CPanel. Ubuntu is generally faster, all things considered, and definitely easier to use, upgrade, and maintain.
Like I said, I do think that Ubuntu dominates small business, but large businesses still focus on RHEL. Cloud has just made numbers like install counts easier to track. I wouldn't necessarily say that the number of installs even matters compared to what those installs are being used for, and how much clout Canonical has in the Linux community versus Red Hat. Red Hat directly employs a ton of critical people in standard Linux projects like freedesktop, etc.
A part of their core users are admins that don't get paid by the hour.
And since we're being controversial: any good admin would run Debian (or maybe Ubuntu) if they had the choice, if only because you can run the same system on your laptop and on the servers. Watching someone manage their Red Hat/CentOS servers from OSX (or even Windows) is just painful, especially if something is wrong or down (which is usually when you need an admin the most).
> what debians end goal is.
There is no end goal, it's simply a group of people making a distribution of software they use themselves. It's partly a matter of convenience.
Why? (Genuine question; I've been doing that for years (from Windows) and never had a particular issue with it, even during crises. Not that I had the choice; if I did, I would probably have installed Linux.)
I'm more of a developer than a sysadmin, but whenever I've had to use a different distro/OS on my development machine from what the servers were running, eventually I've hit a bug that only occurred on the servers and couldn't be reproduced on my development machine. And sure it's never impossible to diagnose and fix those bugs, but it's a bunch of extra effort that you just don't have to make if you're running the same distro everywhere.
You can ask your sysadmin for a FaaS (such as JupyterHub, or anything you like) to be deployed on machines that are iso-prod. You can also ask them to provide a proper CI/CD toolchain (start small with the minimum: git, to build, to test deployment, to unit tests). And of course matching your dev machine with production will be beneficial if you can't get a FaaS, but your native OS may still be Windows or macOS; you'll simply use virtualization or Docker to make it compatible.
As a sysadmin I face this problem a lot here. I work with Python and R devs, and we provide them with server-side IDEs (RStudio and JupyterHub); otherwise they will have something running on their Windows laptop (provided by external teams; we have no control over that, nor the right to change it) and expect it to run the same in production.
But as a sysadmin I don't face the problem myself, because I don't develop much, that's why I didn't get the initial comment.
Edit: just to clarify a point. A few months ago when I arrived, it was indeed really painful to see people write Python or R code on their Windows machines and not be able to run their code on the server. Some of them even asked for Windows servers to get rid of the issue. That's the type of miscommunication I'm talking about, because on the other side of the email, someone gave them the Windows servers, not understanding or caring about their actual need.
It was obviously the wrong solution but, in desperation and with the incoming deadlines, it became the right solution. So the problem isn't CentOS vs Debian vs your laptop, it's really about consistency in the toolchain and pipelines. The exact tool doesn't even matter (as long as it works).
You're right that the issue is consistency rather than the particular OS. But there is a real point here that Ubuntu in particular is functional as a first-class OS both on developer laptops and on production servers, in a way that none of the other options can quite match. (Windows' administration support/automation is lacking on the server, OSX servers barely exist, CentOS is outdated on the laptop, Debian is too principled to support laptop hardware, Fedora is undersupported on the server...). So it does make for a unique selling point for Ubuntu in practice.
It can range from simple stuff like having "man mdadm" available right there on your laptop because the wifi is spotty, to searching for stray/mysterious files by installing the same software on your laptop and doing a locate (because locate might not be installed on the server, or running "updatedb" might be inappropriate there at the time). And that's not even touching on exotic stuff like suddenly being in dire need of a serial terminal ("Wait, GNU Screen does that? I already have that installed!").
It's a thousand little things like that that make me feel like a fast typist/programmer watching someone hunt-and-peck through entry forms when I watch someone administer Unix systems from Windows (and OSX, but to a lesser degree).
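The man/locate trick above, concretely (assuming a Debian-ish laptop; package names are illustrative and may differ):

```shell
# Mirror the server's package on the laptop, then index and search locally:
sudo apt install mdadm mlocate
man mdadm          # read the docs offline, spotty wifi or not
sudo updatedb      # safe to run here, unlike on a loaded production box
locate mdadm.conf  # see where the package scatters its files
```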
But I totally get your other points. See my other answer above for a partial solution. I think the main issue overall is the lack of dialogue between sysadmins and devs. I'm a sysadmin in a dev team, and we don't have that issue together, because we are deploying the right tools and making things easier for everyone. But whenever we talk to the "other" sysadmins, I see a lot of frustration on both sides because they simply don't speak the same language.
Those "other" sysadmins provide the whole underlying infrastructure, and it's "latest CentOS" and nothing else. They don't have time to explore, maintain, and secure other options. They don't have time to be more specific about the tooling they can provide. They just give cpu/ram/storage in the form of a VM that you order from a ticketing system or a self-service portal. And then they spend their day rebooting unresponsive VMs and opening network incidents. So they couldn't care less about your Python module that's not the exact same version as on your laptop. (Heck, they don't even provide R or Python; we have to build and package it ourselves, dependencies included.)
So they get frustrated, and the devs get frustrated, and nobody can ever work together properly, because the right strategy is not in place. But this has nothing to do with your laptop running this or that OS, or the servers running CentOS instead of Debian. It's about the pipeline from your POC to your production code, and how the people before you tackled the issue (hint: most of the time they didn't).
Why is Debian special in that respect? I once worked at a place where we ran CentOS on our workstations, and the corresponding RHEL on servers, for exactly this reason.
I've always felt RH/Centos were a little less flexible when it came to desktop/laptop setups.
Not going to rephrase what others have said, but wanted to add a funny anecdote: a long time ago when I worked for telcos, many customers required a Red Hat professional license for any piece of infrastructure running Linux. We therefore used to license one Red Hat server for each deployed server running Debian :-p
10-15 years ago I tried Red Hat / Fedora and it wasn't great. I then also tried Debian, and it just worked and was easy to use.
Now, I'm not looking for support much. I prefer the base stability of Debian/Ubuntu. I haven't looked into Red Hat in a long time, and I also don't see any big reason to.
I've also found dnf/yum to be overall nicer to use than apt; the history feature is especially useful.
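The history feature mentioned here works roughly like this (the transaction IDs are made up for illustration):

```shell
dnf history                   # list past transactions with IDs
dnf history info 42           # show exactly what transaction 42 changed
sudo dnf history undo 42      # revert just that transaction
sudo dnf history rollback 41  # revert everything after transaction 41
```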
For historical reasons. Debian was around before Fedora, CentOS, and Red Hat Enterprise Linux (the last one I'm not 100% sure about). I also remember growing up thinking Red Hat had a more enterprise-focused community, whilst Debian was more for the masses. Wrong perception? Perhaps, but a perception nonetheless. Practically everyone I knew who used Linux back then did it also for ideological reasons tied to the free software movement. Red Hat was a company; Debian was a movement. It had a social contract: https://www.debian.org/social_contract.1.0 which aligned with most of the free software community's ideals back then. This was a time when not licensing your project under the GPL and instead choosing a BSD/MIT-style licence was frowned upon because "something something Microsoft BSD TCP/IP stack, you see?".
I imagine everyone has their reasons, mostly subjective. The first time I got into GNU/Linux was circa 1999 with Red Hat 6 (pre-Fedora days), and Debian was not yet on my radar. Fast-forward a year, and I installed Debian (I think it was "potato") on my machine and fell in love almost instantly, thanks to apt, their ethos, and just the way the entire distro was designed (relatively easy switching from stable, to testing, to edge...). Mind you, I was a very novice user and not that tech savvy, but I was (I still am) a "believer" in the free software movement. I only had a 256kbps internet connection and used to download packages sporadically. The Debian CDs came with quite a good number of packages, and I remember magazines that distributed CDs with entire repositories (you could add your CD as a repository source).
Fast forward a few years, and I had left Debian for Slackware, then FreeBSD, then Gentoo (they lured me with the "portage is like ports" hype), and then Arch Linux. I always kept my respect and admiration for the good experience I had with Debian as a user. I never felt that for Red Hat. What always took me from one distro to the next was tooling. I'd say tooling is the number one aspect of user experience in Linux distributions for a certain type of user, one who tends to be less tech savvy than the rest. People want to install packages that auto-configure, and then tweak if necessary. Debian gave you that (sort of). You even had unattended upgrades. Back in the early 2000s that was a real "OMG" feeling for some, myself included.
> I've also never understood the appeal of .deb or apt.
The same reason Homebrew is such a success in macOS, or chocolatey for Windows.
Fedora might be OK; I've never used it and I've never known anyone who used it either. It always seemed more fringe than Debian and all its various child distros.
Is this sarcasm? I don't read LWN enough to know if they are dryly joking, and I don't read the Debian listservs enough to know if it's true or not.
Given the recent article by Michael Stapelberg that made it to the front page of HN the other day, and now this, it seems like things are a bit rocky at Debian these days.
A shame too, because even though I don't use it much any more, I still consider it one of, if not the, most important Linux distributions available. Hopefully this is just a glitch that will get sorted sooner rather than later!
(I also don't think any of the problems in that post are about developers being disagreeable or argumentative - just about Debian having a lot of technical debt and no easy way to pay it down.)
As someone who no longer uses it but has a lot of history with Debian, I hope you are correct and that this latest issue of not finding a leader on the standard schedule is just something they route around. Based on the tone of the end of today's article, it sounds like that's the case.
There's also the problem (IMHO) of OSX having pulled otherwise Linux-first developers away. I feel an especially heavy effect on desktop efforts like Wayland and Systemd.
Some time since then, things went a bit "weird".
The latest trends in software development (containers, kubernetes) are just moving too fast for Debian to keep up, IMO. I think we're going to see the rise of 'purpose-built kubernetes' distros.
I guess as a user I frequently take this for granted.
Debian also, unjustly, suffers from its non-corporate image/reputation.
Most of my professional sysadmin career has been spent managing RPM-based Linux distributions. The time I was asked to netboot and automatically configure multiple Ubuntu/Debian/Mint workstations, I found that there just wasn't the breadth of netboot/kickstart-type documentation available on Debian based systems. The smaller number of users doing that sort of thing led to less documentation. (I'm not saying that is definitively the case nowadays, it's just that lots of large scale deployments mean a lot of documentation and projects to support such activities)
Having a different focus than Fedora (in fact, one could say an almost radically different one) hasn't stopped Red Hat from contributing people, hardware, and support to Fedora -- leaving the volunteers to focus on contributions, be it code or, as you point out, documentation.
Help with 'administrative' or grunt-work tasks goes a long way in large OSS projects, and Ubuntu/Canonical simply haven't stepped up where they should have.
For business vs. for developers, cutting edge vs. stability, etc.
> While this is sad and painful to read, I can't say I'm surprised.
> The problems listed are precisely the kind of problems that Redhat
> strategically supports fedora with, in terms of investment of resources.
> For all the hate Redhat receives it has consistently been a good
> community member by being willing to help fedora in areas that it knows
> are hard and yet not 'cool' enough to attract volunteer contributions.
> What has Ubuntu done for the debian community along the same lines?
Of course, that doesn't stop 'em from throwing in lip-service:
You're right about Red Hat though, they get a lot of negative feeling from people who may not understand just how much they give back to the community at large.
I think GKE's 'cos' is based on the ChromeOS/Gentoo version of CoreOS.
Even if nobody puts their name forward this week or the next or the week after, and even with the sarcastic remark in the last sentence, it looks like this is not such a big deal after all.
> In other words, the project will manage to muddle along for a while without a leader, though various aspects of business could slow down and become more awkward if the current candidate drought persists.
Why should it muddle along or slow down? All the responsibilities listed (and linked to in a mail in the mailing list) seem to be things that others could take up, either individually or as smaller groups. It's something that could be worked out over a period of time.
> One might well wonder, though, why there seems to be nobody who wants to take the helm of this project for a year. The fact that it is an unpaid position requiring a lot of time and travel might have something to do with it.
The last sentence is probably the issue. Couldn't sponsorships help? The last thing we need is Debian to become corporatized.
Donate to Debian at https://www.debian.org/donations
As implied and cited by the article itself, money and Debian are uneasy bedfellows.
> The last thing we need is Debian to become corporatized.
I highly doubt this will be a likely outcome here.
So far it has worked great for me on my test laptop and I will be rolling out some VM's shortly.
There are quite a few names I've never heard of, and some that I've heard of, but not as DPL.
The last six (after Hocevar) are total blanks for me.
Have my interests changed so much, or did the recent DPLs keep a much lower profile?
And I admit that a few years ago I would have ranked apt as much better than yum. A matter of preference/habit, I guess.
It seems to have served its purpose well so far. It is relatively easy to use and hasn't gotten in the way of anything I've wanted to use it for yet.
You can run mainline on the rPi just fine with standard tools; the only problem is that most rPi projects on the internet won't work, as they target Raspbian instead of mainline Linux.
The current Debian "developers" have reached uniform agreement that:
1) None of them will participate in the election
2) People outside the "developer" circle (including major contributors, who do most of the actual work on Debian) can't participate in the election
The whole "crisis" is artificially created. Not that it matters; most decision making is already done by corporate employees, not some fictitious "community leader".
On the contrary, the project is growing in terms of contributors, packages and internal projects.
https://reproducible-builds.org/ mostly came out from Debian (see https://wiki.debian.org/ReproducibleBuilds )
Despite the name "leader", the role of DPL is closer to an ambassador, with no power to tell other people what to do.
The DPL appoints a handful of roles; the hands-on work is then delegated to and done by other people.
The drama on blogs and mailing lists is in no way representative of how most DDs feel about the project.