
> someone who's contributed their time as a volunteer to an open source software project that we have come to rely on, now has some sort of an obligation to drop everything and do more unpaid work for a trillion dollar company

If you could highlight the relevant part of the bug report that demanded the developers "drop everything" and do "unpaid work for a trillion dollar company", that would be great because I'm having trouble finding it. I see "hey, we found this bug, we found where we think the issue is in the code and here's a minimal reproduction. Also FYI we've sent you this bug report privately, but we will also be filing a public bug report after 90 days." And no, I don't think having a policy of doing a private bug report followed by a public report some time later qualifies as a demand. They could have just made a public report from the get go. They could also have made a private report and then surprised them with a public bug report some arbitrary amount of time later. Giving someone a private heads up before filing a public bug report is a courtesy, not a demand.

And it's really funny to complain about Google expecting "unpaid work for a trillion dollar company" when the maintainers proudly proclaim that no less than Google is paying them for consulting work on ffmpeg[1][2][3].

[1]: https://fflabs.eu [2]: https://fflabs.eu/about/ [3]: https://ffmpeg.org/consulting.html





Publicly posting an exploitable bug IS asking for someone to drop everything and come fix the issue NOW.

So when someone finds a bug in software, in your mind the only acceptable options are:

1) Fix it yourself

2) Sit on it silently until the maintainers finally get some time to fix it

That seems crazy to me. For one, not everyone who discovers a bug can fix it themselves. But also a patch doesn't fix it until it's merged. If filing a public bug report is expecting the maintainers to "drop everything and do free labor", then surely dropping an unexpected PR with new code that makes heretofore unseen changes to address a claimed security vulnerability must be a much stronger demand that the maintainers "drop everything" and do the "free labor" of validating the bug, validating the patch, merging the patch etc etc etc. So if the maintainers don't have time to patch a bug from a highly detailed bug report, they probably don't have time to review an unexpected patch for the same. So then what? Do people sit on that bug silently until someone finally gets around to having the time to review the PR? Or are they allowed to go public with the PR, even though that's far more clearly a "demand to drop everything and come fix the issue NOW"?

I for one am quite happy the guy who found the XZ backdoor went public before a fix was in place. And if tomorrow someone discovers that all Debian 13 releases have a vulnerable SSH installation that allows root logins with the password `12345`, I frankly don't give a damn how overworked the SSH or Debian maintainers are, I want them to go public with that information too so the rest of us can shut off our Debian servers.
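
(To make that hypothetical concrete: the dangerous combination would be something like PermitRootLogin and PasswordAuthentication both set to yes in sshd_config. Purely as an illustrative sketch, assuming the standard config path, something like this could spot it; a real check would use `sshd -T` to dump the effective configuration.)

    # Illustration only, for the hypothetical above: does this sshd_config
    # explicitly enable both root login and password authentication?
    # Only detects literal "yes" lines; defaults, Include and Match blocks
    # are ignored, which is why a real audit would use `sshd -T` instead.
    from pathlib import Path

    def allows_root_password_login(path="/etc/ssh/sshd_config"):
        enabled = set()
        for line in Path(path).read_text().splitlines():
            stripped = line.strip()
            if not stripped or stripped.startswith("#"):
                continue
            parts = stripped.split(None, 1)
            if len(parts) == 2 and parts[1].strip().lower() == "yes":
                enabled.add(parts[0].lower())
        return {"permitrootlogin", "passwordauthentication"} <= enabled

    if __name__ == "__main__":
        print(allows_root_password_login())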


Responsible disclosure policies for contributor-driven projects can differ from commercial projects. Also, if Google has the funds to pay for bug finding, they also have the funds for bug fixing the community projects they depend on.

> Responsible disclosure policies for contributor-driven projects can differ from commercial projects.

They can, but there's not an obvious reason why they should. If anything, public disclosure timelines for commercial closed-source projects should be much, much longer than for contributor-driven projects, because once a bug is public ANYONE can fix it in the contributor-driven project, whereas for a commercial project you're entirely at the mercy of the commercial entity's timelines.

> Also, if Google has the funds to pay for bug finding, they also have the funds for bug fixing the community projects they depend on.

They do. And they do. They literally hire the ffmpeg maintainers via the maintainers' consulting business (fflabs.eu), and they routinely contribute code to the ffmpeg project.


> They can, but there's not an obvious reason why they should.

Of course there are obvious reasons: corporations have the resources and the incentives to fix bugs promptly once threatened with disclosure, and they don't respond well otherwise. None of these apply to volunteer projects.

> They literally hire the ffmpeg maintainers via the maintainers' consulting business (fflabs.eu), and they routinely contribute code to the ffmpeg project.

Great, then they should loop in the people they're paying on any notification of a vulnerability.

Of course, if this had truly been the case, nobody would have heard of this debacle.


> None of these apply to volunteer projects.

How so? Volunteer projects have maintainers assigned to the project writing code. The "resources" to fix a bug promptly are simply a matter of choosing to allocate your developer time to fixing the bug. Of course, volunteers might not want to do that, but then again, a company might not want to allocate their developers to fixing a bug either. But in either case the solution is to prioritize spending developer hours on the bug instead of on some other aspect of your project. In fact, volunteer-driven projects have one huge resource that corporations don't: a theoretically infinite supply of developers to work on the project. Anyone with an interest can pick up the task of fixing the bug. That's the promise of open source, right? Many eyes making all bugs shallow.

As for incentives, apparently both corporations and volunteer projects are "incentivized" to preserve their reputation. If volunteer projects weren't, we wouldn't be having this insane discussion where some people are claiming filing a bug report is tantamount to blackmail.

The only difference between the volunteer project and the corporation is that even the head of a volunteer project can't literally force someone to work on an issue under the threat of being fired. I guess technically they could threaten to expel them from the project, and I'm sure some bigger projects could also deny funding from their donation pool to a developer that refuses to play ball, but obviously that's not quite the same as being fired from your day job.

> Great, then they should loop in the people they're paying on any notification of a vulnerability.

If only there was some generally agreed upon and standardized way of looping the right people in on notifications of a bug. Some sort of "bug report" that you could give a team. It could include things like what issue you think you've found, places in the code that you believe are the cause of the issue, possibly suggested remediations, maybe even a minimal test case so that you can easily reproduce and validate the bug. Even better if there were some sort of email address[1] that you could send these sorts of reports to if you didn't necessarily want to make them public right away. Or maybe there could be a big public database you could submit the reports to where anyone could see things that need work and could pick up the work[2] even if the maintainers themselves didn't. That would be swell, I'm sure some smart person will figure out a system like that one day.

[1]: https://ffmpeg.org/security.html [2]: https://ffmpeg.org/bugreports.html


xz was a fundamentally different problem: it was code that had been maliciously introduced into a widespread library, and the corrupted version was in the process of being deployed to multiple distributions. The clock was very much ticking.

The clock is always ticking. When you find a vulnerability, you have no idea who else knows about it or how they learned of it, or whether it is currently being actively exploited. A choice to delay disclosure is a choice to take on the risk that the bug is being actively exploited, in order to reduce the gap (and the risk in that gap) between public disclosure and remediations being available. But critically, it is a risk that is being forced on the users of the software: they are unable to make an informed decision about accepting the risk because they don't know there is a risk. Public disclosure, sooner rather than later, MUST be the goal of all bug reports, no matter how serious the bug and no matter how overworked the maintainers.


