Note that this repo has been sidelined, as I have fundamental issues with the protocol SSB is built on. Unless there are changes to how messages are signed and verified, I'm not planning to put any serious effort into SSB.
The signature covers the whole message, and is then added back into the message it signed. This means you have to encode JSON exactly the same way as the "main" node.js implementation: emojis, unicode, HTML literals, everything. Otherwise verification fails. I've gotten most of it working, but there are still edge cases where I gave up trying to get them to work correctly.
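A minimal sketch of what that verification dance looks like, assuming illustrative field names (`author`, `signature`, etc. are not the exact wire format) and that the signer serialized with something like Node's `JSON.stringify(obj, null, 2)`:

```python
import json

def signing_bytes(msg: dict) -> bytes:
    """Reproduce the exact bytes the signature was computed over."""
    # Drop the injected signature field before re-serializing.
    unsigned = {k: v for k, v in msg.items() if k != "signature"}
    # These options approximate Node's JSON.stringify(obj, null, 2):
    # 2-space indent, insertion-ordered keys, non-ASCII left unescaped.
    return json.dumps(unsigned, indent=2, ensure_ascii=False).encode("utf-8")

msg = {
    "author": "@examplekey.ed25519",       # illustrative value
    "content": {"type": "post", "text": "hi ☺"},
    "signature": "examplesig.sig.ed25519", # injected after signing
}
print(signing_bytes(msg).decode("utf-8"))
```

Any divergence in these serialization options (key order, indentation, unicode escaping) changes the bytes, and the signature check fails even though the data is "the same".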
I know that I sound like a broken record, but this is exactly the issue which canonical S-expressions were designed for, and which SPKI wrestled with & solved twenty years ago.
The SPKI version of a message would look something like (I've removed the hash property, because I don't think it makes sense for an object to specify the hash to be used to refer to it, but one could add it back in if one wished):
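Filling in the shape gestured at above, a hypothetical SPKI-style rendering of an SSB message (field names are illustrative, and it's shown in the readable "advanced" form rather than the length-prefixed canonical form) might be:

```
(signed
  (message
    (author (public-key (ed25519 |base64-key-bytes|)))
    (sequence "3")
    (previous |hash-of-previous-message|)
    (content (type post) (text "hello world")))
  (signature (ed25519 |base64-signature-bytes|)))
```

The key structural difference: the signature wraps the signed object instead of being injected into it, so the signed bytes never have to be reconstructed.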
Canonical S-expressions already buy you bit-for-bit identity when hashing, and SPKI (as an example) wraps signatures rather than injecting them — the only sane choice.
Why do I bring this up? Obviously it's possible to make JSON a cryptographically sound format (either by forgoing objects for arrays, or by rules around object-field ordering, along with other rules about encoding), but using it instead of an already-sound format indicates an unfamiliarity with prior work in the field.
Perhaps more to the point, the need for canonicalization in this use case is well understood from not only canonical S-expressions, but similar things done in XML and other cryptographically signed structured data formats. While not using one of the existing formats is not necessarily a bad thing, overlooking the well-established need for canonicalization is quite bad.
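To make the canonicalization point concrete, here is one possible rule set, in the spirit of JCS (RFC 8785) rather than SSB's actual rules: sorted keys, minimal separators, raw UTF-8. Any two encoders following the same rules produce identical bytes, so hashes and signatures agree:

```python
import hashlib
import json

def canonical(obj) -> bytes:
    """One possible canonical JSON encoding (illustrative, not SSB's)."""
    return json.dumps(obj, sort_keys=True, separators=(",", ":"),
                      ensure_ascii=False).encode("utf-8")

# Same data, different key insertion order:
a = {"text": "héllo", "type": "post"}
b = {"type": "post", "text": "héllo"}

assert canonical(a) == canonical(b)
assert hashlib.sha256(canonical(a)).digest() == hashlib.sha256(canonical(b)).digest()
```

A full scheme also has to pin down number formatting, string escaping, and duplicate keys; the sketch only shows the key-ordering half of the problem.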
Yup, I've got an implementation of HTTP over packet radio I've been working through, and I came to the same conclusion: the header carries the signature, and the message is base64-encoded for verification.
Even though it's a super low bitrate (1200 bps / 9600 bps), having a binary foundation with base64 means you can send raw binary for low overhead, and still easily verify and decode, since base64 is well supported across a wide range of languages.
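The scheme described above can be sketched roughly like this, with HMAC-SHA256 standing in for a real public-key signature to keep the sketch dependency-free (the header name and frame layout are made up for illustration):

```python
import base64
import hashlib
import hmac

KEY = b"shared-demo-key"  # stand-in for a real signing key

def build_frame(payload: bytes) -> str:
    """Signature in a header, base64 body: survives any text transport."""
    sig = hmac.new(KEY, payload, hashlib.sha256).hexdigest()
    body = base64.b64encode(payload).decode("ascii")
    return f"X-Signature: {sig}\r\n\r\n{body}"

def verify_frame(frame: str) -> bytes:
    """Decode the body and check the detached signature."""
    header, body = frame.split("\r\n\r\n", 1)
    sig = header.split(": ", 1)[1]
    payload = base64.b64decode(body)
    expected = hmac.new(KEY, payload, hashlib.sha256).hexdigest()
    if not hmac.compare_digest(sig, expected):
        raise ValueError("bad signature")
    return payload

assert verify_frame(build_frame(b"\x00\x01binary\xff")) == b"\x00\x01binary\xff"
```

Because the signature is detached from the payload, the verifier never has to re-serialize anything: it checks the exact bytes it received.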
I had expressed interest in changing it, but it would require changing the core data structure the network is built on, and basically be a breaking change that would almost inevitably result in having to start the entire social web over from scratch. Didn't see huge interest from the devs in doing so.
It sounds like the best path forward might be to document the NodeJS serialization format in perfect detail and create a test suite to verify all known corner cases. Until that's done, alternative implementations will be unreliable.
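Such a test suite would essentially be a list of (input, exact output bytes) vectors. A sketch of what that might look like from Python, approximating `JSON.stringify(obj, null, 2)` with stdlib options (a real suite would generate the expected strings from node itself):

```python
import json

def node_ish(obj) -> str:
    """Approximate JSON.stringify(obj, null, 2): insertion-ordered keys,
    2-space indent, non-ASCII left unescaped."""
    return json.dumps(obj, indent=2, ensure_ascii=False)

# Hypothetical test vectors: input object, expected serialized form.
vectors = [
    ({"text": "emoji \U0001F600"}, '{\n  "text": "emoji \U0001F600"\n}'),
    ({"a": 1, "b": [1, 2]}, '{\n  "a": 1,\n  "b": [\n    1,\n    2\n  ]\n}'),
]
for obj, expected in vectors:
    assert node_ish(obj) == expected
```

The hard part is the corner cases this sketch ignores: float formatting, lone surrogates, and escaping rules, which is exactly where cross-language implementations tend to diverge.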
It makes it difficult to implement the protocol in anything but node.js (and it's potentially version-specific, if node.js ever decides to tweak its JSON serialization).
This is the same reason I lost interest. I might have stuck with it if the Rust crates weren't AGPL/GPL (and thus can't be used in Apple's App Store, which is an interesting target for me).