Anyone this person convinces to use and contribute to his fork is going to take time and money (yes, money - the guy put up a Patreon campaign and a BountySource page) away from GnuPG, which sorely needs it, as we all hopefully remember. Please do not use this, and I hope the authors give it up.
I would love to do code review, but I need a second developer for that!
Thank you for your interest and taking the time to write down your criticism.
Oh no it isn't. It does have more features, and for some fields, sure it is better (user interfaces, simulations).
However, this is security software we are talking about. C++ introduces more complexity as the spec is an order of magnitude bigger than, say, C99. More complex language spec, more complex tooling, lots of places for bugs to hide. It can help with classic buffer overflows and the like, sure. But it introduces its own set of issues.
>Many of these issues are well-known or can be easily researched. They have been documented many times, and can be avoided.
Right. The issues can be avoided. You just need to not make mistakes, right?
Also, this is worrying:
> Converting the legacy code base (490,000 lines of code) to C++ was a straightforward, mechanical task that took only a couple of hours
So we are good? The fact that it compiles (and maybe even works) doesn't automatically validate it as a secure rewrite (and it is a rewrite, no matter how similar the common language subset is). The stakes are just much higher for something like GPG.
I do wish this project good luck, though.
That sounds like they haven't actually converted it to C++ so much as compiled it as C++.
That means they're probably still using raw owning pointers and no RAII, which means you get none of the safety benefits of C++ (which do exist, so long as you use modern C++ practices like smart pointers, and containers instead of C-style arrays).
There are many programming languages, and I believe in picking the right tool for the job. In the case of NeoPG, the priorities were:
- Support for strong cryptography.
- Compatibility with C application developers.
- Convert legacy code quickly.
- Tool support for QA.
Everything else is, at this point, a secondary concern. The Sequoia project uses Rust, and I envy that. But the first thing they had to do was wrap an existing C crypto library (they chose libnettle), because there is no high-quality crypto library for Rust yet. That is their challenge.
My challenge will be to stay focussed on the parts of C++ that are actually helpful, and not get bogged down by the rest.
I know it's mostly semantics, but calling C from C++ still requires some wrapping (extern "C", integrating build systems, etc.).
I've found with bindgen that, unless there are some crazy macro shenanigans going on, it's actually quicker for me to integrate C libraries into Rust than to wrangle CMake/Make/etc. The C FFI is very much a first-class citizen in Rust.
Even doubly so if we're talking about a cross-platform library.
A blog post on the decision to use an external library might be good. Is there a reason to think that the library will be better maintained than code with similar functionality in GPG?
Great work by the way! Thank you!
Comments like "couldn't they have used $niche-lang.org instead of [C++/Java/...]" bore me, to be honest.
I think C++ is not a great choice for complex crypto software because it doesn’t have very good safety-by-construction properties and it has to deal with (extremely) untrusted and probably malicious input. Parsers written in C or C++ are historically one of the biggest attack vectors out there.
I choose languages with tight control over memory for crypto-related stuff, since you want to prevent leaks. You'd follow up with a covert-channel analysis to be sure. If it's not crypto and the goal is stopping code injection, then a memory-safe language with interface checks and input validation will cover most problems. Such languages have been around since the first business mainframe, the Burroughs B5000.
When I wrote it, though, I was mainly thinking of recent work that converts functional, verified specs into imperative code that extracts to ML. I think that has a lot of potential for producing a pile of verified data structures at low cost, similar to the COGENT language that was used for the ext2 filesystem. Got a link to Imperative/HOL-to-ML below.
We all know C++ is a valid design decision. We all know more hip languages have underdeveloped ecosystems. Yet opinions on which language to choose are not devoid of value, because they indicate interest, spark conversations, and may initiate development of the tools needed to make those ecosystems less 'underdeveloped'.
The only thing we don't need is people saying another person's opinion is boring.
EDIT: Also, they said they were disappointed they used C++. Maybe they wished they stuck with C?
(Not an attack -- a genuine question because I'm interested in your choice and reasoning.)
Whatever value you assign to $LANG, you will always encounter a denizen of HackerNews who disparages $LANG and would've used $LANG-prime. But by not actually moving first to start a project with the tool of their choice, they have kind of ceded their right to criticise.
That’s my point of view, at least: don't criticise the artist's choice of tools. It's rude.
C++ has an excellent C FFI :-)
I understand the author wants to clean up the core and the API, but if there's already a project that cleans up the API, isn't that good enough? Especially considering that the standard it implements, PGP, is still quite complicated.
This might just be a misconception I have, but I always thought you couldn't make a batteries-included, self-contained, works-out-of-the-box app if you used GPGME. You'd first have to tell the user to go get a GnuPG implementation from somewhere.
I'd be super happy to be wrong on this.
A standard without multiple independent implementations is a bit unhealthy.
One of the reasons we built PGP signing capabilities into Krypton is to make it easy for anyone to sign their Git commits/tags. Almost nobody uses this awesome feature of Git. We even ended up implementing parts of the OpenPGP spec in Swift.
It may be an opinionated fork, but not a well-thought-out one.
How is it possible to take GPL 3.0 code, say that it is "more restricted", fail to mention the point of that restriction (and what would happen without it), and then claim that any additional code contributed to the project must be licensed under a BSD-style license?
The author has some serious misunderstandings about licensing issues.
Starts about 47:25 in if the time-index doesn't work for you.
Just reorganizing existing commands into subcommands like 'gpg encrypt' instead of 'gpg --encrypt' :)
As for key management, keyrings and trust, I think a proper UI with decent explanations would help a lot too. This isn't rocket science, but it's too hard to guess what various abbreviations mean.
I wonder what you think is complex. The trust model, maybe? The different kinds of signatures? Negotiating which algorithms to use?
For me it culminates in the way gpg feels really opinionated about key management (with keyrings). Way too often (in relative terms) I end up creating a temp dir, setting GNUPGHOME, then setting some permissions to quiet gpg down, then importing the keys, then actually doing the thing I wanted, and finally cleaning up. I have no doubt the keyring design works wonderfully for gpg's author(s), but for a tool that should really be more generic than that, it feels less than ideal.
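A hedged sketch of that dance (the gpg invocations are commented out because they depend on your actual keys; alice.asc and the recipient address are placeholders):

```shell
# The temp-keyring dance described above. GNUPGHOME is the variable
# gpg actually reads; 700 permissions silence its warnings.
tmp="$(mktemp -d)"
chmod 700 "$tmp"
export GNUPGHOME="$tmp"
# gpg --import alice.asc                       # hypothetical key file
# gpg --encrypt -r alice@example.com file.txt  # the thing you wanted
rm -rf "$tmp"
unset GNUPGHOME
```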
gpg's monolithic design is probably the fundamental problem here: in addition to making the tool unnecessarily cumbersome to use in some cases, it also makes it more difficult to learn piecewise (imo).
For one example, see my comment here: https://github.com/keybase/keybase-issues/issues/2230#issuec... That operation should basically be "curl ..|gpg-key --to-ssh", but instead it explodes into a 10-line bash script, complete with parsing gpg output with grep/awk.
Just as knowing how git works under the hood doesn't necessarily make you great at managing branches in git. More often than not it's way easier to just go with a ready-made recipe, so that your workflow is easily accepted by the industry, rather than read the docs.
These two articles describe trust in detail:
The GPG esoteric options page is also a good read:
Besides that... the RFC itself I suppose: