IMHO the second answer does not hold water. If you end up in a situation where you are being tortured, they will torture you until you say how to add the backdoor.
His point is that he can't backdoor it: you can read the code before you install it. I'd go further, and say that this is true of anything end-to-end encrypted, open-source or not, because it's not 2002 anymore and reversing ordinary client software is table stakes. (I'd still rather run something open source, ceteris paribus).
Feeding the paranoia above is that cperciva would verifiably be the smartest person in the room. A canny torturer would respond to this by bringing in djb as the primary instrument of torture. "First one to break or weaken scrypt or 8-round salsa20 gains their freedom". The loser is forced to give talks at AWS marketing conferences for the rest of their natural life.
Not being able to "backdoor it" (presuming this means "exploit a backdoor the torturer presumes you have already put into it") does not prevent you from getting tortured to backdoor it.
All it does is, should that occur, prevent you from giving the torturer what they want to end the torture.
OTOH, convincing the torturer by, among other means, public statements in advance that you have failed to consider this and believe that not having that ability prevents torture, and that for this reason you do not have it, might prevent torture. But that's a big gamble on potential future torturers believing your public statements of motivation.
Exploitable but obscured backdoors in software distributed in a form that is compiled and installed by downstream users are not impossible, though sufficient auditing may make them improbable.
He probably should have said that, if that's what he meant. In his answer he implies that he could in fact backdoor it but chooses not to because of the liability.
Reversing ordinary client software is table stakes, sure. I'm not so sure about reversing client software which has a deliberately hidden backdoor. (You can hide a backdoor in source code too, of course, but I think it's easier to hide one in a binary because you could e.g. ensure that a buffer overflow overwrites cryptographic keys, where a C compiler would have the freedom to change the memory layout.)
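To make that concrete, here's a minimal C sketch (purely hypothetical, not from Tarsnap or any real codebase) of an overflow whose effect depends entirely on stack layout:

    #include <stdio.h>
    #include <string.h>

    /* Whether the overflow below clobbers `key` depends on how the
     * compiler lays out the stack frame. At the source level an
     * auditor can't tell if it's exploitable; in one specific
     * shipped binary the layout is fixed, so whoever built that
     * binary knows exactly what bytes the overflow overwrites. */
    void start_session(const char *name)
    {
        char hostname[64];
        unsigned char key[32] = {0};  /* imagine a fresh session key */

        strcpy(hostname, name);  /* missing bounds check */
        printf("connecting to %s (key byte 0 = %02x)\n",
               hostname, key[0]);
    }

    int main(void)
    {
        start_session("backup.example.com");
        return 0;
    }

That's the asymmetry: the same source line is either harmless or a key-replacement backdoor depending on a layout decision only the binary's builder controls.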
That is to say, by the same logic it's entirely possible that you were already tortured and the backdoor is already there - no time machine needed
As already said, unfortunately the only real safety would be reading the code
Coerce you into sending something like "All users must upgrade to client version xyz because of a backdoor discovered by the NSA in the encryption used in older clients. I'm not allowed to tell you what it is, however, rest assured, the latest versions do not have this vulnerability." (but do have a backdoor that I've been tortured into adding).
And then wait for a scheduled backup with the backdoored client.
Though XZ says that's impossible, so I won't lose sleep over that scenario.
I am confident that if I sent a message like that, the top application security and cryptography experts in the world would collectively descend on the Tarsnap source code to figure out what changed.
Honestly, I really wish the Tarsnap server was open source. I imagine it has not been released as such because that would probably hurt the business a lot, especially given that the costs per GB are currently approximately 50 times more than I would pay for simple object storage on B2.
I built our company's first backup solution on Tarsnap, but when I projected out what deploying that to our entire fleet would cost, I rebuilt on Restic. We currently pay something like $250/mo for our backups, as opposed to the approximately $12,500/mo they would cost on Tarsnap.
Colin, if you've ever hoped to compete with your own software and provide support to people running your whole stack so they can avoid paying you anything, you should give some serious thought to open-sourcing the whole thing.
Yeah I get it, if one wants to make money off one's software, one shouldn't give it away for free, right? I'm just highlighting why I do not recommend Tarsnap professionally. It's great if you're going to be storing under 1 TB of total backups. Otherwise, you're paying 50x as much as you need to. Back when it was released, the alternatives were not as good. Today, restic seems to work just as well (and yes, I've done restores, both as a test and under real data loss circumstances) and supports object storage natively.
By the way, I absolutely love spiped. It beats the pants off stunnel in both stability and performance. Maybe Colin should close-source that and start charging $0.25/GB for traffic that flows through there too? :P
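For anyone who hasn't used it: the entire interface is a pregenerated shared key plus a source/target socket pair. A typical setup looks roughly like this (hosts and ports here are placeholders; the spiped README's SSH example is the canonical version):

    # generate a 256-bit shared secret and copy it to both hosts
    dd if=/dev/urandom bs=32 count=1 of=keyfile

    # server: accept encrypted connections on 8025, decrypt to the local daemon on 8026
    spiped -d -s '[0.0.0.0]:8025' -t '[127.0.0.1]:8026' -k keyfile

    # client: accept local connections on 8025, forward them encrypted to the server
    spiped -e -s '[127.0.0.1]:8025' -t '[203.0.113.5]:8025' -k keyfile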
Consider that Colin's target customers might be paying for things other than raw storage, that most products are poorly marketed with cost-plus pricing, and that trying to make everybody happy is usually a bad plan. Make something that some people love, not something that everybody likes.
He's been doing this long enough, I'm not even prepared to dunk on him for picodollar pricing anymore.
The process could be designed so that doing so generates an alarm visible to other people. For example, since the backdoor does not exist and has to be developed, the attacker has to hold them hostage for some period of time, during which loved ones may report a missing person. The software might then have to be signed with a key whose use alerts the whole engineering team, so someone else in the company could investigate the unauthorized release as a cyberattack. Perhaps the release signing key is physically stored in the office (e.g. on a YubiKey), which would also require the attacker to pull off an office heist.
Surely some three-letter organization could probably pull that off, but it adds the risk that the operation could be leaked.
This is basically a point I've made in a few of my talks about security and cryptography: The point of cryptography isn't to guarantee that your data is safe; it's to raise the cost of an attack to the point where a potential attacker decides not to attack. In particular, there's usually a human involved somewhere (sending or receiving information, or both) and humans are squishy and fragile; but torturing people attracts far more adverse attention than torturing data.
No, he won't, because there is no back door. Or yes, because his torturer-contractor thinks there is. Either way, the last part of your sentence doesn't hold water.