Very good write up!
Something I've been curious about is whether there are proxies that can convert between protocols. For instance, on your local machine you could run a proxy that turns TCP connections on port 8000 into FASP connections to another machine. This would let you use an ordinary web browser over FASP.
You could even chain the proxies, e.g., TCP -> FASP -> MinimaLT. That way any program could have really fast data transfer over an encrypted tunnel.
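The relay part of such a proxy is straightforward; here's a minimal sketch. It accepts ordinary TCP connections locally and blindly copies bytes to an upstream endpoint. A real TCP -> FASP gateway would replace the upstream TCP socket with a FASP session (Aspera's protocol is proprietary, so this only shows the relay skeleton), and the names here (`run_proxy`, `pipe`) are illustrative, not from any real tool:

```python
import socket
import threading

def pipe(src, dst):
    """Copy bytes one way until the source side closes."""
    try:
        while (chunk := src.recv(4096)):
            dst.sendall(chunk)
    finally:
        dst.close()

def run_proxy(listen_port, upstream_host, upstream_port):
    """Accept local TCP clients and relay each one to the upstream endpoint."""
    server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    server.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    server.bind(("127.0.0.1", listen_port))
    server.listen()
    while True:
        client, _ = server.accept()
        upstream = socket.create_connection((upstream_host, upstream_port))
        # One thread per direction. A protocol-converting proxy would
        # translate here instead of copying bytes verbatim.
        threading.Thread(target=pipe, args=(client, upstream), daemon=True).start()
        threading.Thread(target=pipe, args=(upstream, client), daemon=True).start()
```

Chaining proxies then just means pointing one proxy's upstream at the next proxy's listen port.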
This seems like it would be more easily implemented with smudge/clean filters. In fact, this is exactly what git-crypt does. The idea is that you run a filter on every file as it's checked out to do the decryption, and on every file as it's staged to do the encryption. This requires nothing extra on the remote repository, and you can use git commands as normal.
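For reference, the smudge/clean mechanism is wired up through `.gitattributes` plus two `git config` entries. A hand-rolled sketch of the same idea might look like this (the filter name `mycrypt` and the `openssl` commands are stand-ins; git-crypt does its own key management and uses deterministic encryption so the clean output is stable across runs, which this naive version does not):

```shell
# Declare which files go through the filter (this file is committed).
echo 'secrets/** filter=mycrypt' >> .gitattributes

# "clean" runs when a file is staged, "smudge" when it's checked out.
# Both read the file on stdin and write the transformed file to stdout.
git config filter.mycrypt.clean  'openssl enc -aes-256-cbc -pbkdf2 -pass file:.keyfile'
git config filter.mycrypt.smudge 'openssl enc -d -aes-256-cbc -pbkdf2 -pass file:.keyfile'
```

Since `openssl enc` salts its output randomly, git would see the encrypted files as constantly changed; that's the gap git-crypt's deterministic construction closes.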
On the other hand, git-remote-gcrypt (https://github.com/joeyh/git-remote-gcrypt/) encrypts when pushing and decrypts when pulling, which leaves the local repository unencrypted but keeps it encrypted on the untrusted remote server.
Very interesting article, and well-written walkthrough.
Git is already a great tool whose workflow for handling incremental updates I happen to be used to, so there was no need to reinvent the wheel.
Specifically, he mentions wanting copies of his private keys backed up, but NOT having copies of his private keys in a GitHub repository -- which would be the case with a standard push over ssh.