Ask HN: Is it time for a public GPG audit?
94 points by anotherhue on Sept 15, 2013 | 38 comments
With GPG being a common destination for those concerned by the recent privacy revelations, it bothers me a little that I can't find any audit or security review of GPG's codebase.

The Wikipedia page says that a German IT ministry funded a Windows port, and the EU Agency for Network and Information Security lists GPG in its index of tools and claims it's in use by some related parties [0], but doesn't go so far as to recommend it. Considering that several governments within the EU are allegedly complicit in the SIGINT scandal, I don't think their word counts for much.

GPG is open source, but while the code is readily available, the knowledge and background needed to assess its security are somewhat rarer. Would you be willing to contribute to a project to fund a public audit of the codebase? If so, what sort of people would you like to see participate?

[0] http://www.enisa.europa.eu/activities/cert/support/chiht/tools/gnupg-the-gnu-privacy-guard




That's not really how "audits" work. Coordinated public audits are responsible for a tiny fraction of all vulnerability discoveries. Most discoveries are independent. There would probably be a fairly poor return on investment for funding an official audit.

Bear in mind also that even though you've never heard of an audit of GPG, GPG is actually a pretty high-profile target. Smart people have already looked at that code pretty carefully.

Since GPG is an open source project, a better approach would be to find a way to sponsor a bounty for vulnerabilities in GPG. But here too you'll run into problems:

* It will take fo-re-ver to adjudicate what does and doesn't qualify as a serious finding. Google and Facebook manage this problem by hiring very smart vulnerability researchers and allowing them to come up with criteria pretty much by fiat. Here, you're going to end up in a 2-month-long argument about whether man page bugs are vulnerabilities because of the nature of the project.

* Output of these programs is nonlinear and unpredictable, so it'll be tricky to figure out how much money needs to be set aside to satisfy reward payouts. In the meantime: who holds that money? And where does it go when the bounty outlives its utility?

If you really want to do some good, consider starting a project (which would require no funding) to either:

(a) Build a replacement GPG in a more modern development environment, or

(b) Annotate all of GPG's source code.


I actually audited gpg in 2006 while studying; it was an interesting project. Here are some of the results:

Remotely controllable function pointer http://lists.gnupg.org/pipermail/gnupg-announce/2006q4/00024...

GnuPG does not detect injection of unsigned data http://lists.gnupg.org/pipermail/gnupg-announce/2006q1/00021...

False positive signature verification in GnuPG http://lists.gnupg.org/pipermail/gnupg-announce/2006q1/00021...


That's quite impressive, what were your impressions of the codebase in general? Were you aware of anyone else performing similar work at the time?


It would be useful to quantify this for various projects with an estimate of how many hours have been spent probing the codebase by well-intentioned security experts. Just like negative experimental results in the sciences, such findings are rarely published.


I agree, that would be extraordinarily useful. But well-intentioned security experts usually (a) don't really track their hours on side projects, and (b) have a status-derived incentive to lowball.


>(a) Build a replacement GPG in a more modern development environment

(c) Build a replacement GPG that developers can (easily) use as a cross platform library.


AFAIK there are already two libraries that can deal with OpenPGP messages; it'd be better to improve one of the existing ones than to write yet another from scratch :)

* NetPGP (BSD licensed, recently imported into OpenBSD 5.4): http://www.squish.net/pipermail/odc/2013-March/014482.html

* GPGME (GnuPG Made Easy, it invokes the gpg executable behind the scenes): http://www.gnupg.org/download/index.en.html#gpgme http://www.gnupg.org/documentation/manuals/gpgme/
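The GPGME model (drive the gpg binary rather than reimplement OpenPGP) is easy to sketch. The flags below are standard gpg options, but the recipient and filename are made up, and a gpg install can't be assumed, so this only builds the invocation:

```python
# Sketch of the GPGME approach: shell out to the gpg executable as a
# subprocess instead of reimplementing the crypto. The recipient and
# filename here are hypothetical; the command is built but not run,
# since gpg may not be installed.
import subprocess

def encrypt_cmd(recipient, infile):
    """Return the argv for encrypting `infile` to `recipient`."""
    return ["gpg", "--batch", "--yes", "--encrypt",
            "--recipient", recipient, infile]

cmd = encrypt_cmd("alice@example.org", "notes.txt")
# subprocess.run(cmd, check=True)  # run only where gpg is available
```

Wrapping the executable this way inherits gpg's audited code paths instead of duplicating them, which is exactly the trade-off GPGME makes.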


Gutmann's cryptlib, as well, I guess.


Bouncycastle does some of this stuff in Javaland.


Without better documentation, I doubt anyone can actually use the baroque Bouncycastle PGP/GPG APIs without messing something up.


This has already been done by people who understand the fundamentals of crypto. Check out http://nacl.cr.yp.to/

It is even available natively in Go http://godoc.org/code.google.com/p/go.crypto/nacl


I thought I might have been this site's loudest proponent of NaCl, but maybe not? Here, though, NaCl isn't a useful suggestion; it isn't compatible with PGP. (There is a project that wrapped PGP's UX around NaCl's crypto, but one presumes it's virtually unused, since it can only talk to itself.)


I was trying to address the "library" aspect, but I agree NaCl is a poor replacement for GPG in user facing functionality.

Anyone who is in need of crypto I push toward NaCl, GPGME, or exec'ing out to GPG itself before they try to build it themselves.


Nacl is good, but is Go a suitable language for crypto work yet? I understand there were some reservations in the past, though I can't recall exactly what those were.


I think Go is in a bit of a chicken-egg problem right now w.r.t. crypto. I recall one of the core devs saying you shouldn't use it for anything where security really matters, because their implementation hasn't really had the tires kicked. The tires probably won't be kicked as much with that sort of suggestion though either.


Go serves HTTPS for many Google properties and is a first class citizen for security applications internally.


Good to know.


The only issue I recall is that the random-looking data involved in crypto confuses the conservative GC, so your app is unlikely to release memory in a timely fashion. More of a problem in the small address space of a 32-bit application.


I was oblivious to this. Do you know of any projects using it to implement OpenPGP?


>>It will take fo-re-ver to adjudicate what does and doesn't qualify as a serious finding.

>>Output of these programs is nonlinear and unpredictable, so it'll be tricky to figure out how much money needs to be set aside to satisfy reward payouts

How many tech billionaires and multimillionaires are there? Let them kick some back to the industry so things like this get done. Even if they donated a relatively tiny amount, the above points become moot: there would be plenty of money to attract the best hackers, and everyone would be happy. If Google, FB, Yahoo and Microsoft donated $1 to $5 million each, would they even feel it? Nope; maybe they'd even be able to claim a tax deduction.

>> In the meantime: who holds that money? And where does it go when the bounty outlives its utility?

A foundation?


Who manages the foundation, and won't tinfoil hatters just assume the foundation is being corrupted? Especially if its primary benefactors are the large corporate entities that they're already extremely afraid of?


How would the foundation get corrupted? If someone found a vulnerability, and wanted to disclose it to them, then they would just go public if there was any "funny business" (e.g. refusing to pay-out, trying to keep the vulnerability to themselves, etc).


You could insist on a long secrecy period "to get the fix tested and rolled out", sell the vulnerability to the NSA and delay the fix for it as long as practically possible.

Or the NSA could just read the vulnerability from your e-mails. It's not like the NSA needs to pay.


I think that the NSA's game plan would be the other way around.

1) Write a patch that implements a subtle security flaw in GPG.
2) Kidnap Bruce Schneier's dog.
3) Send ransom note to Bruce Schneier with instructions.
4) Bruce Schneier announces fake security hole in GPG and submits tainted patch to foundation.
5) Foundation rejects tainted patch.
5.5) Have Schneier make a big stink about the foundation not fixing the (non-existent) bug or implementing the (tainted) patch.
6) Assassinate Schneier to keep him quiet.
7) Mainstream declares it ridiculous that the NSA would have killed Schneier.
8) Start rumor that the NSA hijacked the foundation and killed Schneier for trying to patch a critical bug.
9) The foundation's report on the security of the original code is now discredited.
10) Previously secure individuals now install the tainted NSA patch to protect themselves from a non-existent bug.
11) If anyone points out the flaw in the patch or the non-existence of the original bug, hire sock-puppets to accuse them of being NSA sock-puppets.


I agree with you completely. Just saying that the involvement with corporate sponsors will bring out the conspiracy theorists in droves.


Git provides us with a great deal of transparency.

Here's an overview of GnuPG's committers:

  Werner Koch:      2677 commits over a period of 5764 days.
  David Shaw:       1197 commits over a period of 3807 days.
  Marcus Brinkmann:  202 commits over a period of 3753 days.
  NIIBE Yutaka:       53 commits over a period of 641 days.
  Moritz Schulte:     39 commits over a period of 1756 days.
  Timo Schulz:        29 commits over a period of 896 days.
  Stefan Bellon:      21 commits over a period of 765 days.
  Repo Admin:          9 commits over a period of 2634 days.
  Andrey Jivsov:       8 commits over a period of 37 days.
  Ben Kibbey:          6 commits over a period of 20 days.
  Neal Walfield:       5 commits over a period of 1 day.

So it looks like the codebase has been touched by remarkably few hands!

This doesn't negate the need for a code review of some sort, but it does suggest that it would be difficult for an outside agent to silently introduce changes in master without the core developers noticing.

EDIT: Formatting.


I think the threat model the OP is worried about is a three letter agency finding an unintentional bug and keeping it for their own use. He's asking for an audit from someone we can trust to actually turn over the vulnerabilities.


I think it would be rather easy to infiltrate a development team, especially a geographically distributed one.


This is a movie plot threat.

Most vulnerabilities in software are not caused by suborning one of the developers -- developers create vulnerabilities all the time without being subject to adversarial action. The adversarial actor just has to look at the software artifact created and find things about it that the original developer does not realize are true.


What about cracking the password to one of those people's Git hosting account, and then submitting a commit without anyone knowing? What if GitHub is compelled by a National Security Letter to allow others to modify source files and erase the logs for those changes? What if one of those guys is the next Sabu, facing 50 years in jail unless they cooperate?


You realize that you cannot modify a commit's content without modifying the commit hash for every subsequent commit?

Your attack would be practically impossible against a git repo.
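The point can be illustrated with a toy model of git's object chain. Real git hashes tree, author, and message objects, not raw bytes, so this is a simplified sketch of the mechanism, not git's actual format:

```python
# Toy model of git's commit chain: each commit hash covers the content
# plus the parent commit's hash, so silently altering any historical
# commit changes every descendant hash. (Simplified: real git hashes
# structured commit objects, not raw content bytes.)
import hashlib

def commit(parent, content):
    """Hash a commit as parent-hash || content."""
    return hashlib.sha1(parent.encode() + content).hexdigest()

c1 = commit("", b"initial code")
c2 = commit(c1, b"add feature")
c3 = commit(c2, b"fix bug")

# Tampering with the first commit ripples through all later hashes.
t1 = commit("", b"initial code with backdoor")
t2 = commit(t1, b"add feature")
t3 = commit(t2, b"fix bug")
assert (c2, c3) != (t2, t3)
```

Anyone holding the old tip hash would notice the mismatch immediately, which is why rewriting published history without detection is impractical.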


There's a possibility that a developer acted in bad faith, for the sake of love, money, or fear. I have no reason to think that this has happened, but since it is possible, and the code is open, we can check.


It is time for a public OpenSSH audit.

Word on the street is that the code is horrific, and last I checked it wasn't even checked into git in any form yet.


It's not checked into git... because they use CVS.


The OpenBSD guys are usually pretty good at security, but does CVS have the integrity verification of more modern version control systems?

Since OpenSSH is probably responsible for more data than GPG (the importance of that data aside), I think you're spot on.


Would it make sense to somehow record what was audited, and by whom, in a machine-readable format? Something that would allow others to later check how much the audited parts of the code (or the code the audited parts rely on) have changed since the audit.

Could be, for example, just a simple message ("Audited, file: aaa/xyz.c, checksum 3ea1b.., revision 1c030..") signed with the auditor's private key.
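A minimal sketch of such a record, assuming a hypothetical JSON format (the field names and the `audit_record`/`still_valid` helpers are made up for illustration; actual signing would use the auditor's key, e.g. via gpg, and is omitted here):

```python
# Sketch of a machine-readable audit attestation. The record format is
# hypothetical; a real deployment would sign the JSON with the
# auditor's private key (e.g. gpg --clearsign), which is omitted here.
import hashlib
import json

def audit_record(path, revision, data):
    """Build an attestation covering one file at one revision."""
    return json.dumps({
        "status": "audited",
        "file": path,
        "checksum": hashlib.sha256(data).hexdigest(),
        "revision": revision,
    }, sort_keys=True)

def still_valid(record, current_data):
    """Check whether the audited file is unchanged since the audit."""
    rec = json.loads(record)
    return rec["checksum"] == hashlib.sha256(current_data).hexdigest()

source = b"int main(void) { return 0; }\n"
rec = audit_record("aaa/xyz.c", "1c030abc", source)
assert still_valid(rec, source)
assert not still_valid(rec, source + b"/* tampered */\n")
```

Tooling could then walk a repo, compare each file's checksum against the newest attestation covering it, and flag anything audited code depends on that has since changed.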


It's always a good time for a revision of privacy/security tools.


Have a look at the changelog. It's not as if people haven't been looking at it.



