Ask HN: Who are some unsung heroes in open source that need more support?
177 points by notaplumber on July 2, 2018 | hide | past | favorite | 118 comments

All the wonderful people who package software for their distro!

We take for granted the ability to install anything via our package managers, but it's made possible by the efforts of a vast number of contributors who often just liked a piece of software and stepped up to let everyone else use it too. Too often they get zero recognition for their efforts.

I'm doubly grateful because I'm also an upstream maintainer who doesn't have to try to keep up with all the various packaging processes.

If you want to support them, the next time some software is unavailable or lagging behind in your distro, see what you can do to help!

The flipside of this is I think packaging for Debian has become far, far too difficult.

As someone who has packaged a lot of software for Debian, CentOS, Gentoo, NixOS, and GNU Guix, I have to agree -- it's by far the most difficult platform.

I actually believe much of the Docker hype can be attributed to this. Why bother learning packaging when you can just script something and throw it in a container?

Now I just use Guix on any distro for my up-to-date/custom software needs.

I do recall that Debian packaging, coming from the RPM world, had a very steep learning curve, but that mostly felt like an outsized up-front investment rather than ongoing pain. (RPM was too long ago for me to say if it was similar.)

I'm curious [1] which aspects you find make the Debian (and Ubuntu?) system particularly difficult. Specifically, I wonder if those aspects are difficult by design, in order to "force" effort on the part of the packager to extract benefit on the part of the user or distro, if they're arbitrary, or, perhaps, if they're by design but misguided.

[1] No horse in this race; I'm mostly an end-user whose packaging work is limited to internal use only.

There's a big gap between making a working .deb file and getting something to pass QA to get into the archive.

It's been quite a while since I looked at it, but the documentation was fragmented and not up to date. Things have gotten better now that you can use git-buildpackage [0], but it still feels arcane.

Challenge - take a package like wget and try to update it to the latest version from upstream Git (or whatever). It should be one or two commands, but I've never been able to make it work easily. It's even harder if you're trying to port something across from Ubuntu or to use a fork like wget-lua [1].

0: http://honk.sigxcpu.org/projects/git-buildpackage/manual-htm...

1: https://github.com/alard/wget-lua
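For reference, the happy path for that challenge with git-buildpackage looks roughly like the sketch below. This is a hedged illustration, not a recipe: the repository URL and version number are made up for the example, and it assumes the packaging repo follows the usual gbp branch layout with a working debian/watch file — assumptions that, as the comments above note, often don't hold in practice.

```shell
# Illustrative only: update a Debian package to a new upstream release with gbp.
# The URL and version below are hypothetical placeholders.

gbp clone https://example.org/debian/wget.git   # fetch the packaging repo
cd wget

# Import the newest upstream release located via debian/watch (uscan),
# merging the tarball onto the upstream branch.
gbp import-orig --uscan

# Record the new version in debian/changelog.
dch --newversion 1.20-1 "New upstream release."

# Build the source and binary packages.
gbp buildpackage
```

When any step deviates from these assumptions — no watch file, a repacked tarball, patches that no longer apply — the "one or two commands" promise breaks down, which is the gap the parent comment is describing.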

> There's a big gap between making a working .deb file and

I assume none of us is even talking about this case. That is, a trivially working .deb could provide little more functionality than a tarball, so that's hardly interesting.

> getting something to pass QA to get into the archive.

The last part I have no knowledge of, since I only ever put my packages into internal archives. Were there political aspects, or only technical?

For the technical side, I'm still looking for specifics (ideally from the GP, but from anyone who shares the opinion).

> the documentation was fragmented and not up to date.

This is a common complaint I hear/read, and I recall it was part of my initial learning investment, figuring out where to go for what information. Now it's mostly bookmarked, and changes tend not to be drastic.

> Things have gotten better now you can use git-buildpackage

One thing I found, repeatedly and consistently, was that every single external (i.e. not from Debian) utility that tried to make the process "better" or "easier" ultimately did the opposite, since it would hide or abstract away an important aspect of the packaging process.

> Challenge - take a package like wget and try to update it to the latest version from upstream Git (or whatever). It should be one or two commands but I've never been able to make it work easily.

I can't recall if I've done wget specifically, but I've done this kind of thing before without much issue, assuming that latest version actually builds and runs on that platform without porting work and without needing a ton of new dependencies (or new versions of existing ones).

I tended to find most of my time was spent in chasing down and re-packaging various libraries whose older versions were no longer good enough.

> It's even harder if you're trying to port something across from Ubuntu

I'm not sure what you mean, since Ubuntu is already using Debian packaging.

> or to use a fork like wget-lua

Certainly forks are going to be more effort, but my experience is that this is true even in their unpackaged state, if they require more exotic (or specific) dependencies be pre-installed, a custom build process, or any porting work. If someone had already done all that work, comprehensively documented it, but merely not translated it into debianization, it would save me a ton of time in creating a custom package.

Would you be willing to share your bookmarks?

Sorry if you get asked this a lot, but why Guix over Nix?

I know I'm going to get downvoted to hell for this, but am I the only one who thinks of these people's efforts as a huge amount of wasted time? Why should anyone have to package software aside from the developer who made it? Why should it have to be done for so many different package repositories and packaging formats? Why are people ok with this?

Because the software is inconsistent (and thus buggy) in that regard. Also, barely any developer possesses the knowledge and infrastructure necessary to build their software for dozens of arcane architectures.

Packagers create the glue between the software (which is heterogeneous) and the distro (which is internally consistent). The world without this glue is a horrible mess and a huge waste of time; the old Slackware is a good example. It's thanks to the distro makers that you don't have to spend days collecting requirements and acquiring arcane knowledge just to install a piece of software.

So: yes, those people devote a huge amount of time - so the countless users around the world don't have to. The net time value is positive.

I think we need to distinguish between software that is used as a component in a larger system (e.g., the core OS, which is the core business of a distribution), and an application that is not part of the larger system (distribution) but merely runs on top of it. For putting together a distribution (with tightly integrated components), the traditional packaging philosophy is probably very well suited. For add-on third-party software that is not an integral part of the distribution but merely wants to run on it, not so much.

An independent software author (e.g., Ultimaker or Prusa) just wants to reach all "Linux" users at once without dealing with different distributions' policies, and without needing to use the same version of, e.g., Qt that happens to be in a given distribution. And as a user of their software, I want it on my "Linux" system at the same moment Windows and macOS users can have it.

A lot of the strength of open source comes (much like in computation in general) from chopping tasks into smaller and smaller pieces. The upstream developer is unlikely to be skilled at packaging for a number of different platforms. If platform users can rely on others who are skilled at packaging, we all benefit.

The biggest reason, both historical and present, that we have many different package repositories and packaging formats is because of the concept of shared libraries and the desired benefits: security, stability, speed, freshness of volatile data and disk usage.

And from what I have seen there is only one primary alternative being proposed: containers. Have a copy of all needed libraries in every package and leave it to the developer to patch and fix security issues. Occasionally put a few things into the operating system, but then you have to hope that those parts don't change, or you have to make different packages for different versions of the operating system.

What we need is a clear separation between the Core OS a.k.a. base system which should be provided by every "Desktop Linux" distribution, and the rest.

Applications should only use those shared libraries that come with the Core OS a.k.a. base system, and either link statically to or bundle the rest.

Like an iOS application can only consume what iOS provides or bundle any additional dependencies privately.

The result would be a much simpler and more resilient system (at the expense of some storage and memory overhead, which is the lesser evil imho).

> The result would be a much simpler and more resilient system (at the expense of some storage and memory overhead, which is the lesser evil imho).

On balance, I agree with the conclusion. However, coming from a non-desktop viewpoint (server, not embedded, though I do sympathize with the latter), I don't think it's obvious that "some" overhead is worth it, nor has it historically been worth it.

At scale, size can matter, though I think that today nobody would even notice.

It's tough to "fight" that history, though, so we go through the pain of even more overhead with full virtualization before cutting it back with OS-level virtualization (a.k.a. containers) and (re-)declaring victory.

I disagree that it's a waste of time. (But I upvoted, because I think it's an important question, thereby adding to the conversation).

> for so many different package repositories and packaging formats?

How many are there, though, really? In theory, the number is unbounded, but, in practice the number of distros is modest, the number of popular ones is smaller, and the number of unique packaging formats even smaller.

Although a sibling comment alluded to it with distros being internally consistent, I wanted to unpack that a bit more.

Specifically, one benefit I've found as a "user" (sysadmin/devops) is that of well-defined dependencies. This isn't inherent to packaging, but it tends to be a feature of the more mature systems and distros.

The other benefit is that it provides a more universal mechanism of traceability and, thereby, at least a path to reproducibility of builds. This has implications for security, of course, but also for debugging.

I’m fairly sure that it often is the developers packaging it for different platforms...

Sometimes, though, other people do it... and it might even be automated.

Systems like the Open Build Service can ease the pain a bit by building for different distributions and versions, but it is a pain nevertheless. Luckily the Open Build Service instance at https://build.opensuse.org/ can also do AppImages, which run on most "Desktop Linux" systems.

Upvote from me, I fully agree with you. I'm sick and tired of people re-packaging python/npm/ruby/etc applications as distro packages. As a maintainer of a few high-traffic python libraries, it wastes my time. I am not ok with it.

Do you have some details about how this wastes your time?

Paul Davis of the Ardour project. Ardour is probably one of the best digital audio workstations ever made. Sad thing is that Ardour is such niche software that it doesn't get as much monetary support as it should.


Indeed. I make an automatic small monthly contribution of a few dollars to the project (they make it super easy to do it).

BTW since Harrison Consoles have a commercial version of it (MixBus/MixBus32C) and are actively contributing to Ardour, I wonder if they are also supporting it financially.

I hope so! I think the MixBus team has contributed some code back to Ardour.

AFAIK they have, I wonder if they're also donating.

BTW I switched from plain Ardour to MixBus a few months ago, and I'm not sure why, but it produces much better results for me.

I sure hope they are donating. That would be fantastic for the future of the project.

The better results are probably due to Harrison's EQ and console bus emulation.

Wow. Am I missing something or is Ardour a more capable kind of Audacity?

It's light years beyond it; it's a full-fledged digital audio workstation capable of high-quality studio recordings and mixing. A good analogy would be Paint vs. Photoshop.

EDIT: Not to downplay Audacity which is also awesome, it's just not meant for the same use case.

Nice, and free. I think my podcasting friends may like this.

Ardour is an NLE (non-linear editor); Audacity is not. Also, Ardour's plugins act in real time. Ardour is used for recording and mixing music, film, and other things. Audacity is mainly an audio editing program.

Audacity never required me to save over existing files or magnetic tape. Why do you say it's a linear editor?

I looked into Ardour when looking for a Linux DAW, but the lack of a piano roll made it an immediate no-go for me.

There is a piano roll, you just need to vertically expand a MIDI track in order to see it.

How does Ardour compare to Ableton?

Consider supporting Henry Zhu, the maintainer of Babel. Henry decided to dedicate himself 100% to open-source earlier this year and is one of the main reasons Babel is such an indispensable (albeit invisible) tool. Henry welcomes contributions at https://www.patreon.com/henryzhu/memberships

Babel has an open collective: https://opencollective.com/babel

Babel has some crazy throughput going through it to drive a massive percentage of the modern web. Wild to see how low the funding is.

Seriously. It's insane that some startups are raising VC money and 100% would not be able to exist without Babel being maintained. Some of that money should go to Babel but won't. How do we fix this so that Henry doesn't have to keep begging? It's really broken.

Start from the projects you are using. You might find that many of them are primarily one-man efforts and could use some support.

For example, we use verdaccio (https://www.verdaccio.org/) as a private NPM server. It was so mind-blowingly simple to get a private server working that, when the opportunity presented itself, we felt compelled to donate: https://opencollective.com/verdaccio

In this case, it seems that there was an older abandoned effort (sinopia) whose original developer, Alex Kocharin (https://github.com/rlidwka), stopped working on it. Juan Picado (https://github.com/juanpicado) picked up the mantle, and given how NPM is moving fast and breaking things, it's great to see someone keeping up with the open source solution.

The people building Micro-Manager (and ImageJ) - the (best/only/reasonable) open source software capable of driving most microscopes used in biology - and then processing that data. It's raw, it's complicated, and (non-technical, demanding, one-off) biologists are the end users. But piped through that software is most every piece of live/functional raw data used in all your cancer/genetics/genomics research. It's also a core of many other downstream, forked, or scripted sets of machine control and data processing. It's a thankless job that literally drives humanity forward.


Nico Stuurman used to be one of the project leads, but it's been a while since I've kept up with who runs it now.

Urban from LibrePCB, who has been developing a free/libre EDA suite (for PCB design) mostly by himself for over 4 years now: http://librepcb.org/ The first release will hopefully be out this year.

If you think a FOSS KiCAD "competitor" with a solid and well thought-out library design and good usability should be supported, then check out his Patreon page (or his BTC address). Or – of course – contribute code.

Here you can find a LibrePCB introduction video from this year's FOSDEM: https://fosdem.org/2018/schedule/event/cad_librepcb/

(Edit: Punctuation)

Interesting! I'm a KiCAD user but have never heard of this. Why doesn't Urban just contribute to KiCAD instead of maintaining a separate open source project?

Apologies if he mentions it in the video -- I'm at work.

>> Why doesn't Urban just contribute to KiCAD instead of maintaining a separate open source project?

Not every open source project is built the way a person thinks it should be. The only way to know if an alternate viewpoint is better is to build it and find out. Other times a person just wants to build it for themselves for their own reasons. Either way, variety can be a good thing. One day an alternative may just replace your favorite piece of software and then you might ask why the creator started from so far behind all those years ago...

There are probably as many reasons as there are developers.

Do you know anything about this project or is this just a generic diatribe?

I was hoping to hear about his opinions of the shortcomings of KiCAD and where LibrePCB improves on them. It would certainly help someone like me decide whether to fund him on Patreon or not, seeing as I've donated to CERN for KiCAD development.

Edit: I've found his reasons in his slides - https://www.fosdem.org/2018/schedule/event/cad_librepcb/atta...

As of a couple years ago development of ntp was largely performed by one Harlan Stenn. He (and the funding issues around ntp) made the news a while back, but the guy probably deserves even more visibility, given the crucial nature of his work.

2015 article: https://www.informationweek.com/it-life/ntp-harlan-stenn-and...

Indeed, Harlan and the Network Time Foundation (nwtime.org) are continuing to work on the NTP protocol and related subjects.

And more support is most definitely always welcome!

NB: I am an unpaid volunteer for the NTF and the NTP Public Services Project at ntp.org.

Notoriety means infamy, or the state of being well known for something terrible. Did you mean a different word?

Really? I didn't know that; in Italian a similar word translates to "famous". I wonder how many times I've used it incorrectly...

"Notorious" does mean "famous", but it carries the added implication of being famous for being bad. I suspect the meaning has changed over time, just like "awful" used to mean "awe-inspiring".

I think you mean awesome? Or have I been wrong all along?

"Awesome" still carries a positive meaning. "Awful" changed from being a synonym for "wonderful" to a synonym for "terrible".

Ivan the Terrible was originally Ivan the Awful, as in inspiring awe rather than terror.

Thank you, yes, it's been corrected; "visibility" serves the purpose I intended.

Ornicar[0], author of lila[1], the chess server behind the greatest chess site on the internet[2].

[0] https://github.com/ornicar [1] https://github.com/ornicar/lila/ [2] https://lichess.org/

Jonathan Thomas over at Openshot Video Editor:


It's had its share of bugs, but the amount of work he has put into it over the years is pretty incredible. Most other video editors for Linux don't even come close, and it recaptures a lot of the lost glory of Windows Movie Maker before it was rewritten and made ugly.

Oh yes! OpenShot is so great. I haven't used it in a couple of years but when I did it was a pleasure to use as an occasional user with 0 knowledge of video editing software.

DaVinci Resolve blows everything else out of the water, and it's free. Available for Linux, Windows, and Mac.

Not open source. There's nothing wrong with that, except it's not what we're talking about here.

OpenBSD is heavily underfunded compared to what value they deliver. See https://www.openbsd.org/innovations.html

Jen Fong-Adwent, Jeff Lindsay, Dominic Tarr, James Halliday and Ben Lupton are the five I really admire and think their ideas don't get enough exposure (I realize they've all had successful projects but talking to them at length their ideas generally speaking are fantastic and should be encouraged).

https://github.com/ednapiranha https://github.com/progrium https://github.com/dominictarr

https://github.com/substack https://github.com/balupton

With my OpenBSD developer hat on, hardware donations welcome here too: https://www.openbsd.org/want.html

cough-- I also very much like pizza: https://brynet.biz.tm/wallofpizza.html

The OpenBSD foundation also hasn't received any corporate donations as of yet for 2018:



> cough-- I also very much like pizza: https://brynet.biz.tm/wallofpizza.html

But is it Pizza Pizza?

I'll eat Pizza from anywhere! Even Pizza Pizza.

Shay Banon - founder of Elasticsearch. He's been in the open source space for years, building frameworks like Compass, which influenced Red Hat to build Hibernate Search. He's done other work as well.

Ross Mason - founder of MuleSoft. He built Mule, which is open source and has helped many companies better integrate services.

Joe Walnes - Joe is a beast. He's contributed and created many Open Source projects. websocketd, smoothie, xstream, webbit, and many more.

Rick Hightower - a very smart guy who used to blog about technology, helping many, many people. He also created Boon (a JSON parser), QBit, and many other frameworks. Because of all his hard work, he was also made a Java Champion.

I could name so many more folks. Honestly, it's amazing that more people aren't acknowledged for their efforts.

My partner Mike Schwartz founded Gluu because he was tired of recommending proprietary access management platforms like Siteminder and IBM Tivoli that were locked behind six figure licenses.

We're now a team of about 30 people with 10 years of development into the Gluu Server, a free open source software platform for SSO, 2FA, access management:


Ironically, the reason there are so many unsung heroes of open-source is precisely because their products are not "locked behind" expensive licenses.

It sounds though like this is, in fact, a commercial venture.

I've always been mystified about what the heck Tivoli is. You indicate it's an access management system. Can you please provide more details? I'm intrigued.

Chet Ramey.

He is the maintainer of the GNU Bash shell and of the GNU Readline line-editing library.

* https://tiswww.case.edu/php/chet/

* https://www.red-gate.com/simple-talk/opinion/geek-of-the-wee...

The html5shiv was originally based on a discovery by https://twitter.com/sjoerd_visscher and represents an iterative process involving many developers, at least according to Paul Irish: https://www.paulirish.com/2011/the-history-of-the-html5-shiv...

Chris Hobbs / RX14 (major Crystal language contributor) is 17 years old, is a genius, and is behind a lot of the recent work on the language, including Windows support, parallelism, and other things. You can follow him on GitHub or watch him work on Crystal on Twitch:

https://github.com/rx14 | https://www.twitch.tv/rx14

Loïc Hoguin. The author of Cowboy web server. De facto standard web server in Erlang/Elixir. Foundation for Plug and Phoenix frameworks.

Shotcut Video Editor: https://shotcut.org/

- and -

Natron video compositing software: https://natron.fr/

Both are great tools with a steady stream of updates.

SparkleShare / Hylke Bons

An amazing self-hosted cross platform file syncing/sharing service. Being based on git allows for (somewhat) unlimited version history. Simple and elegant, really surprised more people don't know about it.

Ben Darnell of Tornado and Mike Bayer (aka zzzeek) of SQLAlchemy.

Thanks! I was even going to post myself (SQLAlchemy). However, our deepest areas of need are the most boring and soul-crushing: documentation and bug triaging. Ideally someone else would have full push/release access other than me. We get lots of great patches and pull requests, but I'm the only one moving it all through. I'm probably not easy to work with (but I'm open to improvement!)

Consider supporting George Nachman, creator of iTerm2 over at https://www.patreon.com/gnachman.

iTerm2 seems to be the favourite terminal of macOS users… at least those that spend a large amount of their time there (especially with its native tmux support).

Joey Hess:

- https://liberapay.com/joeyh/ - https://www.patreon.com/joeyh/overview

(I support Joey on patreon, as I use git-annex quite a bit)

Please support GNU Octave if you can.

I used to be a big fan of octave until I learned python + numpy + pandas dataframes. After that I did not have much need for octave.

There are a lot of engineers and financial quants using Matlab, and the Octave project makes a lot of that Matlab code usable without an expensive Matlab license.

I guess it's not clear to me why (at least in the 'first world') it's wrong for quants and engineers to pay something for the tools they use. (I have spent a fair amount of time contributing to open source, so I've no problem with OSS generally, but I also don't think there is anything wrong with the folks at Matlab getting paid for their efforts. Quants are sure as hell getting paid for theirs.)

I agree that Matlab has a right to be paid for its product, but it is still nice to have a free alternative.

The better the free software, the greater the exploitation by other people making a bunch of money on it, while the devs work nights and weekends for nothing. I don't have a solution, but there it is.

I just think that Matlab shouldn't be taught at universities.

I used to use and love GNU octave, but much prefer Julia these days, which is also open-source.

That dude that maintains jq. For those that haven't used it, it's like grep but for JSON (and about as fast).
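For anyone who hasn't seen it, here's a quick illustration of the grep-for-JSON idea (assumes jq is installed; the sample data is made up):

```shell
# Filter a JSON document from the command line: print the names of admin users.
# -r outputs raw strings instead of quoted JSON values.
echo '{"users":[{"name":"alice","admin":true},{"name":"bob","admin":false}]}' \
  | jq -r '.users[] | select(.admin) | .name'
# prints: alice
```

The filter language composes with pipes inside jq itself, which is what makes it feel like grep/sed for structured data.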

The ones who, undeterred, maintain software they haven't built themselves. For instance, the maintainers of Mithril.js (pygy, tivac, isiahmeadows) have been doing great work for a long time, with nothing in return. There are many like them out there on other projects, I'm sure, who don't get the thanks that they deserve.

Don't sell yourself short, porsager, you're a part of that finely tuned Mithril machine too.

A guy called Alexey Tulinov maintains an excellent but very little-known, lightweight Unicode processing library called libnu. I really wish it had more recognition: https://bitbucket.org/alekseyt/nunicode

Two other FOSS projects I like but are not super known:

* Calf: A small group of devs producing a set of high quality audio processing plugins. https://calf-studio-gear.org/

* Ardour - pretty well known I think, but only in the narrow cross-section of audio geeks and Linux geeks. https://ardour.org/

Serge Rider - DBeaver - the best open source multi-database access tool. https://github.com/dbeaver/dbeaver . Daily commits, almost a one-man show!

That's an app I use daily. A couple of years back I tried to send him some money as a way of showing appreciation for his work. He wasn't interested, so I kind of assumed money wasn't an issue for him.

Araq and the other Nim core developers.

OpenBSD core developers.

Gael Guennebaud. The Eigen library powers tons of numerical heavy code -- including Tensorflow -- but doesn't get much spotlight. Watching Eigen evolve under Gael's stewardship has been amazing.

There's an NPM package started for this: https://github.com/feross/thanks

Sébastien Doeraene (sjrd) and David Barri (japgolly).

The former built almost all of Scala.js and the latter built scalajs-react.

It's inspired a lot of KotlinJS as well.

Definitely the DB Browser for SQLite team (sqlitebrowser.org). :)

We recently created a Patreon account, if that helps:


RIBS2 (Robust Infrastructure for Backend Systems, ver. 2) https://github.com/Adaptv/ribs2

Great framework for building highly scalable, low-latency services (developer friendly). It would be great to have some detailed tutorials and documentation about its internals.

Jonathan Westhues, author of Solvespace. There are a number of hard things to implement in that code and he did them all initially. Others have been making good contributions too, but it could use more developers. I've been digging in the code myself but have not made a pull request yet. We shall see...

Anyone who makes sure stuff compiles and runs on Windows.

Is there anyone at Microsoft that you would like to especially single out? (-:

I actually wasn't thinking about Microsoft in particular when I wrote this.

Git for Windows, Audacity, Python are all good examples where people have put in a lot of work.

On the Microsoft side, whoever's porting OpenSSH is pretty awesome, too.

Fabrice Bellard. He is already somewhat popular, but most people don't know about him.

openmhealth.org - we're building an open data standard for mobile health data.

PSA: nominating yourself in threads like this is pretty tactless.

Calling people tactless after they've worked their asses off building something they give to strangers for free is more tactless.

Why? If an author or maintainer of a FOSS project wants to highlight their own work, so the community might too benefit, what's the issue?

Well, calling yourself an unsung hero... is maybe a bit, well, low. Usually others judge who is a hero.

This isn't a Show HN post with an open bandwagon. It's about nominating underrecognized achievers, and that recognition can only be given objectively by someone other than the subject in question.

I'm ok with being tactless, if it highlights something important. :)

eg I just pointed out the project (DB Browser for SQLite) that I've been helping out for years. We're widely used, people say very good things about us, and we could definitely use more funding. ;)


Who is he?
