
A few thoughts about Signal’s Secure Value Recovery - feross
https://blog.cryptographyengineering.com/2020/07/10/a-few-thoughts-about-signals-secure-value-recovery/
======
Arnt
I'm quite surprised by the number of people who misunderstand this SGX stuff.

Regarding SGX, Signal is its own attacker. This is a special situation. The
SVR stuff has attackers; I could attack it to get your contacts, but my attack
would not touch SGX. I could try a stream of password guesses, no SGX. SGX is
only relevant if the attacker is Signal itself, presumably ordered by a court.

AFAICT the actual security properties therefore aren't what's important; what
matters is SGX's legal properties. SGX serves to let Signal say in court that
"this data is stored in such a way that we can't read it and therefore we
can't produce it in court; just ask Intel".

If there is a secret way to break SGX and the court learns it, then a court
could reveal that to Signal. "Here Moxie, take this super-duper vulnerability
and use it." Eh. Or the court could order Intel to make a microcode update
that disables SGX. Also eh.

~~~
matthewdgreen
I don't know that you really need to put emphasis on "super secret way to
break SGX". Any single attestation private key, combined with cooperation from
Signal (or a compromised Signal operations admin) will suffice. SGX
attestation has now been publicly broken _twice_ in two years, first with
Foreshadow and then with SGAxe.

To give some color to what "publicly broken" means: there's currently a live
Twitter bot put up by UM that signs valid attestations using an extracted
(non-revoked last I checked) SGX attestation signing key [1]. The previous
exploit, Foreshadow, also had one [2]. Enclaves with the very latest microcode
updates will reject those attestations (and Moxie says Signal has installed
those updates, which I believe), but this vulnerability is now several months
old and has been in non-public disclosure for most of that time, long before
microcode updates reached end-customers like Signal. This means that any of a
team of grad students and professors who worked on this paper (and this is
only one such team doing this work) had access to the ability to "super
secretly break" SGX during that time. Based on the team's track record: I
would personally bet $500 that some team regains this capability sometime
during the next 12 months, using an as-yet-undisclosed new exploit.

So we're not talking about "NSA levels of exploitation" here. We're talking
about "can I pay a smart grad student" levels of exploitation. And one such
exploit, combined with Signal's cooperation, means the loss of all backup data
in an irrevocable and untraceable way.

[1] [https://twitter.com/SGAxe_AaaS](https://twitter.com/SGAxe_AaaS) [2]
[https://twitter.com/foreshadowaaas](https://twitter.com/foreshadowaaas)
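
To make the point above concrete, here is a toy model (in Python, with an HMAC standing in for the EPID/ECDSA group signatures real SGX attestation uses, and all key material hypothetical) of why a single extracted, non-revoked attestation signing key is enough: clients trust any quote signed under a valid key, so a forged quote over a malicious enclave's measurement verifies exactly like a genuine one.

```python
import hashlib
import hmac

# Toy stand-in for SGX remote attestation. Real attestation uses
# Intel-provisioned EPID/ECDSA keys; a shared HMAC key plays that role here.
INTEL_ATTESTATION_KEY = b"provisioned-by-intel"  # hypothetical key material

def measure(enclave_code: bytes) -> bytes:
    """MRENCLAVE-style measurement: a hash of the enclave's code."""
    return hashlib.sha256(enclave_code).digest()

def sign_quote(key: bytes, measurement: bytes) -> bytes:
    """The quoting enclave signs the measurement with the attestation key."""
    return hmac.new(key, measurement, hashlib.sha256).digest()

def verify_quote(key: bytes, measurement: bytes, sig: bytes) -> bool:
    """A remote client checks the quote before trusting the enclave."""
    return hmac.compare_digest(sign_quote(key, measurement), sig)

# Honest case: the genuine enclave attests, the remote client accepts.
genuine = measure(b"genuine SVR enclave code")
assert verify_quote(INTEL_ATTESTATION_KEY, genuine,
                    sign_quote(INTEL_ATTESTATION_KEY, genuine))

# Foreshadow/SGAxe case: the attacker holds the extracted (non-revoked)
# key, so a quote for *malicious* code verifies identically.
malicious = measure(b"enclave that leaks the rate-limiting secrets")
forged = sign_quote(INTEL_ATTESTATION_KEY, malicious)
assert verify_quote(INTEL_ATTESTATION_KEY, malicious, forged)  # client fooled
```

The only defense against a leaked key is revocation plus microcode-gated rejection on the verifier side, which is why the window between private disclosure and deployed updates matters so much in the comment above.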

~~~
Arnt
If an attacker can force Signal's cooperation, why does that attacker need to
worry about SGX at all? Isn't it necessarily simpler to force Signal to change
the client?

The unique thing here is that Signal controls both sides of the SGX barrier.
The strength of the SGX barrier is not terribly important when the same party
is on both sides. Not completely unimportant, but not nearly so important as
when the attacker is someone else.

~~~
matthewdgreen
Every discussion about encrypted messengers has to begin with the assumption
(ideally backed up by actual technical efforts) that the client software is
secure. If you can’t make that assumption, there’s no need for _any_ security
mechanism. You could just as easily say “why use end-to-end encryption, just
send data through the servers in plaintext, since a client compromise could
functionally eliminate any protection offered.”

In practice what designers do is they try to build the most secure system they
can, under the assumption that they can prevent a forced compromise/update of
the client. To do anything else makes the entire problem foolish, because then
nothing works.

In practice, there are real (if not perfect) defenses against a forced client
compromise. From a technical perspective, Signal uses reproducible builds, at
least on Android. They are also legal: we haven’t seen any court case where a
US company has been forced to push software updates, although we do see
routine attempts by law enforcement to gain data from back-end systems -- e.g.
Yahoo, Lavabit.

Nobody says Signal’s client defenses are perfect, and we aren’t yet in a world
where client compromise is absolutely ruled out. But the problems of designing
a secure back-end and a secure client are largely separable. You can work on
both at the same time, and being imperfect in one area does not justify a bad
solution in the other.

------
lunixbochs
Android [1] and iOS [2] both have APIs and UI to associate a custom instant
messenger protocol entry, e.g. "Signal": "signalusername", with a real
contact. SVR is thus not required to store Signal handles in my phone's native
contact list, so why do Moxie and others keep saying this is somehow necessary?

[1]
[https://developer.android.com/reference/android/provider/Con...](https://developer.android.com/reference/android/provider/ContactsContract.CommonDataKinds.Im)

[2]
[https://developer.apple.com/documentation/contacts/cncontact...](https://developer.apple.com/documentation/contacts/cncontactinstantmessageaddresseskey)

~~~
tptacek
People expect a consistent "buddy list" of Signal contacts on all the
platforms they use, and across reinstalls.

~~~
anigbrowl
 _Some_ people do. While this makes things much more convenient for
cross-platform use, or for the case where someone loses their phone and
doesn't want to lose access to all their Signal data, it lowers security
(conceivably) and trust (definitely) when users are forced to store their
contact information through the requirement of a PIN.

~~~
tptacek
Though the article would (I guess unintentionally?) lead you to believe
otherwise, you are not in fact forced to store your contact information with
Signal.

The word "some" is doing an interesting amount of work in your comment,
incidentally, since every other mainstream messenger just stores this stuff in
something like a serverside SQL database.

~~~
anigbrowl
I based that belief not on the article but the fact that Signal threw up a
modal dialog the other day and forced me to add a PIN which I had been
postponing doing due to misgivings about the SGX thing.

 _The word "some" is doing an interesting amount of work in your comment,
incidentally, since every other mainstream messenger just stores this stuff in
something like a serverside SQL database._

It's only doing the work of disagreeing with your blanket assertion. I'm not
now, nor have I ever, gone about telling people they need to get off Signal
and use [something else], so it's unclear to me why you'd read that into my
comment.

~~~
tptacek
I'm reacting to the "lowered trust" in your comment. I take exception to the
idea that Signal has spent down any of their accumulated trust on this. What's
especially galling to me is that this is Signal taking steps to resolve the
biggest complaint nerds have about the platform, and then getting attacked for
it.

~~~
lunixbochs
It's unclear whether Signal is already storing contact lists serverside with
the PIN change, and not having clear messaging around that is part of the
problem. Also, Moxie suggested in the Twitter thread that even if you disable
the PIN you'll end up with a device-generated key and contacts will still
sync.

~~~
tptacek
They spent the last 6 years making people use phone numbers instead of
usernames in order to avoid storing contact lists serverside, and now you
think they're doing it just for the hell of it?

------
tptacek
I don't like SGX at all and wish Signal wasn't using it (they presumably could
do something more like what Apple did with their HSM cluster system, which had
exactly the same purpose of defeating brute-force attacks on low-entropy
secrets).
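
Both the Apple HSM design mentioned above and SVR share the same shape: a low-entropy secret (the PIN) is stretched with a slow KDF and combined with a per-user secret that never leaves rate-limited hardware, so guessing is only possible through the hardware's bounded guess counter. A minimal sketch, with PBKDF2 standing in for the memory-hard KDF Signal actually uses (Argon2) and all class/function names illustrative:

```python
import hashlib
import hmac
import secrets

def stretch_pin(pin: str, salt: bytes) -> bytes:
    # Client side: a slow KDF over the low-entropy PIN.
    return hashlib.pbkdf2_hmac("sha256", pin.encode(), salt, 100_000)

class RateLimitedEnclave:
    """Stand-in for the enclave/HSM: holds a per-user secret and refuses
    to answer after the guess budget is exhausted."""

    def __init__(self, stretched_pin: bytes, max_tries: int = 5):
        self._secret = secrets.token_bytes(32)  # never leaves the hardware
        self._token = hmac.new(self._secret, stretched_pin,
                               hashlib.sha256).digest()
        self._tries = max_tries

    def recover(self, stretched_guess: bytes):
        if self._tries == 0:
            raise PermissionError("guess budget exhausted")
        self._tries -= 1
        candidate = hmac.new(self._secret, stretched_guess,
                             hashlib.sha256).digest()
        if hmac.compare_digest(candidate, self._token):
            return candidate  # master key for the encrypted backup
        return None

salt = secrets.token_bytes(16)
enclave = RateLimitedEnclave(stretch_pin("7324", salt))
assert enclave.recover(stretch_pin("0000", salt)) is None      # wrong guess
assert enclave.recover(stretch_pin("7324", salt)) is not None  # key recovered
```

The whole security argument rests on `_secret` staying inside the hardware: an attacker who extracts it (via an SGX break, in Signal's case) can brute-force a 4-digit PIN offline in milliseconds.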

But this is a weird article. The "Addendum" moots much of what precedes it,
and both the main body of the article and the Addendum postdate the debate
Green and Moxie had on Twitter. The whole thing appears to boil down to:

* Yes, every one of Signal's mainstream competitors stores in serverside plaintext the information Signal protects with SGX, meaning that none of the microarchitectural attacks Green discusses could reduce Signal's security below that of realistic alternatives.

* And, you can now disable PINs on Signal altogether (even before doing that, you could effectively disable them by setting and forgetting an intractable passcode, but I agree that UX would have been inadequate).

* So what we're left with is the claim that Signal could in the future enroll more data in SGX beyond contact lists.

Which of course, they can, and always could have. As the article hints early
on, other messaging platforms have no qualms storing message data in places
that provide something much less than end-to-end-encrypted security
guarantees; Signal could for the last several years have been implementing
those same user-pleasing features without SGX without risking security parity
with other messengers. They didn't. Why am I meant to believe they're going to
do it now?

You're in a really weird place when you're arguing that Signal, the messenger
that requires people to use phone numbers as identifiers (that this infuriates
every nerd on the Internet is not lost on Moxie and Trevor), that has an
encrypted tunnel set up just to share GIFs to foil traffic analysis and
wouldn't roll that feature out without it, that only just in the last 2 years
got _user profiles_ because of the privacy concerns --- when your core
argument is that this team has somehow indicated bad faith about user privacy.
I mean, it's just the opposite.

(I'm no more clued in than anyone else on this stuff but my take on Signal's
excitement about contact lists is that it's a step towards providing something
other than phone number identifiers, which isn't just a big deal on the order
of message backups, but maybe the biggest possible deal with respect to the UX
of the service. But I have nerd bias on this, because I hate using phone
numbers. It's possible it's less of a big deal than I think, and that ordinary
non-nerd users DGAF about phone number identifiers, since it's how they all
message already.)

~~~
matthewdgreen
Signal spent months and years designing a system for backups that had no opt-
in or opt-out feature. They deployed it over a period of months, ignoring a
lot of vocal criticism from people who said "this is fine, but why aren't you
letting me opt out?"

Finally, at the 11th hour, they hacked in an opt-out feature. Note that the
opt-out feature doesn't even stop data from going to Signal's servers --
preventing this upload appears to be technically infeasible! Rather, the hack
is to send the data anyway, but encrypt it under a random key.
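
The opt-out mechanism described here can be sketched as follows: the upload path is identical for everyone, but the opted-out user's blob is encrypted under a key that is generated and immediately discarded, making it unrecoverable by anyone. A toy illustration in Python (the stdlib has no AEAD cipher, so a SHA-256 counter-mode keystream stands in for the real cipher; all names and data are hypothetical):

```python
import hashlib
import secrets

def keystream_xor(key: bytes, data: bytes) -> bytes:
    """Toy stream cipher: XOR with a SHA-256 counter-mode keystream.
    A stand-in for the real AEAD cipher, which the stdlib lacks."""
    out = bytearray()
    for block in range((len(data) + 31) // 32):
        pad = hashlib.sha256(key + block.to_bytes(8, "big")).digest()
        chunk = data[block * 32:(block + 1) * 32]
        out.extend(b ^ p for b, p in zip(chunk, pad))
    return bytes(out)

contacts = b"alice:+15551234, bob:+15555678"

# Opted-in user: the key is derived from PIN material, so the backup
# is recoverable later.
pin_key = hashlib.sha256(b"stretched PIN material").digest()
backup_in = keystream_xor(pin_key, contacts)

# Opted-out user: same upload path, but the key is random and discarded.
throwaway = secrets.token_bytes(32)
backup_out = keystream_xor(throwaway, contacts)
del throwaway  # no party can ever decrypt backup_out again

# The server stores both blobs; only the opted-in one is recoverable.
assert keystream_xor(pin_key, backup_in) == contacts
```

From the server's perspective the two blobs look the same, which is exactly Green's complaint: the data still travels to and sits on Signal's infrastructure either way.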

It does not bode well for Signal's security engineering and design philosophy
that "don't upload data to a server, even when users understand and accept the
consequences of that" is not something Signal even considered supporting, and
their design is completely incompatible with the idea.

If you want to think of this article as a post-mortem on a thing that went
wrong at Signal, that's fine with me. Claiming that it shouldn't be written is
just silly.

~~~
tptacek
I don't believe anyone is really reading this thread and so it's probably not
worth our time to further litigate this, but the criticism you're making here
is of a kind with what I described in my comment: that I'm supposed to believe
that Signal is either working in bad faith or is negligent with user data,
when all the evidence to date strongly suggests the opposite thing: that no
other project of comparable popularity is as diligent as Signal is about not
exposing user data, even at significant cost to UX and uptake.

I don't understand "claiming it shouldn't be written", by the way. That's not
my claim. My claim is that it is written weirdly.

~~~
matthewdgreen
If my post indicated that Signal is working in bad faith, please let me know.
That wasn't my intent and I went to a _lot_ of trouble to signal that.

Even people who have done a good job in the past can make mistakes, and this
post was written to highlight what I think was a mistake.

ETA: I'm not writing here because I want to have a public argument with you.
That's what Twitter is for. I'm curious because you seem to have strong
feelings and I respect your opinion, even if I often disagree with it.

~~~
tptacek
As I said in first comment I wrote here, because of the way you wrote the
piece, with an addendum that basically refuted most of what you had to say in
the long windup of microarchitectural side channel explanation and getting
beers with Daniel Gruss, indications of bad faith from Signal were essentially
all you communicated.

~~~
matthewdgreen
I re-read the addendum. Could you point me to the part that refutes the idea
that microarchitectural side channel exploitation represents a threat to SVR?
Cause I think you and I both agree that it does.

As far as I can tell, all the addendum says is "I complained a lot and as a
result of a direct conversation with Moxie he hacked in a really jury-rigged
opt-out to the system, also he disagrees with me that a compromise of this
system would be a big deal."

~~~
tptacek
How many paragraphs into your post are we before you acknowledge that this
feature, which is universally present in other secure messengers, can simply
be disabled? Is it before or after you discuss how hard it is to disable
message backups in other messengers, a feature that Signal doesn't even have?
Before or after your beers with Daniel Gruss?

(You asked, which is why I'm pointing this out; I'm comfortable with how my
original comment reads, and that comment didn't suggest that nothing like this
should have been written, or accuse you of bad faith).

~~~
matthewdgreen
The first 60% of the post is a technical explainer of SVR, and of SGX, which
it relies on. I've been meaning to write this post for a while, and it doesn't
become less relevant because Moxie hacked in an opt-out -- SVR is still
present in Signal, and therefore it's still technically interesting. It will
likely become _more_ technically interesting over time.

The first paragraph that I think you could actually consider a criticism of
Signal is this one: "My major issue with SVR is that it’s something I
basically don’t want, and don’t trust. I’m happy with Signal offering it as an
option to users, as long as users are allowed to choose not to use it.
_Unfortunately, up until this week, Signal was not giving users that choice_."
(I added the emphasis here, just to make clear you don't have to read to the
addendum for this information.)

I guess we'll have to disagree on this one. It seems like you view this as an
attack on Signal, and I view it as a tech post that contains a mild criticism.

~~~
tptacek
See, I absolutely believe that's why you wrote it this way, because it's how
I'd first approach writing this too: there's a fun explain-y part and then a
"why does this matter" part, which I think is the schematic for basically
every blog post I've written.

But I'm telling you that's not how it reads, especially after you had a
Twitter fight with Moxie, which makes this read as your postmortem to that
argument --- but even for readers unfamiliar with that drama, this is the most
critical you've been of Signal on your blog; at one point late in the post you
even suggest that they might be dishonest about their intentions with the
feature. Even HN, which is not especially inclined to be charitable about
Signal (this is Matrix country) can imagine better reasons why SGX might be
sensible in this design than your post does.

If you don't want it to read that way, you should bring up some of the content
from the addendum, even though it does lessen the impact of your side-channel
stuff when readers know that they can just opt out of it.

~~~
matthewdgreen
I suggested that Signal might deploy an incredibly powerful general-purpose
backup system, and then use that backup system for purposes other than storing
the social graph. And Signal claims that is not their intent, but they don't
rule out storing other things either. They have already hinted at bigger plans
for this system, though not specifically.

I don't think that this is "accusing them of dishonesty", I think it's common
sense. And would be happy to bet you money that I'll be right.

~~~
tptacek
Ok. I think we're more invested in dissecting why I thought your post was
written weirdly than an accusation of "weirdness" really merits, and am
comfortable just disagreeing here.

------
upofadown
The SGX enclave thing doesn't have to be all that secure to solve a problem
for Signal. All it needs to do is be secure enough that they can reject things
like subpoenas. It changes the required level of cooperation from passive to
active.

