
Reproducible Signal builds for Android - qznc
https://whispersystems.org/blog/reproducible-android/
======
JoshTriplett
> Reproducible builds help to verify that the source code in our GitHub
> repository is the exact source code used to build the compiled Signal APK
> being distributed through Google Play.

This is huge; it eliminates one of the biggest issues with distributing
through third-party app stores.

> Just to head off the inevitable deluge of GPG encrypted emails with dramatic
> subject lines, we are not doing this in response to any kind of legal threat
> or pressure. This is just a weekend hack, please don't make us regret it.

I wonder what kinds of mails like these they've received in the past to prompt
this disclaimer?

~~~
sigmar
> I wonder what kinds of mails like these they've received in the past to
> prompt this disclaimer?

I believe he is being a bit tongue-in-cheek. Moxie has previously mentioned
his dislike of the typical emails he gets from the type of people that use GPG
([http://www.thoughtcrime.org/blog/gpg-and-me/](http://www.thoughtcrime.org/blog/gpg-and-me/))

~~~
scintill76
And a more recent statement: "Was reflecting on how the quality of my email
inbox had become less awful lately, then remembered I added a rule to drop pgp
encrypted mail."
[https://twitter.com/moxie/status/709492776635752448](https://twitter.com/moxie/status/709492776635752448)

~~~
Freak_NL
Reading that, and the blog post linked to by sigmar, his stance comes across
as rather belligerent; perhaps even childish. If you want to communicate with
someone who is privacy-conscious and provides a public e-mail address linked
to a GPG identity, encrypting and signing your message to him or her is only
civil and is to be expected.

I understand Moxie's criticism of existing crypto-tools, GnuPG in particular,
and he makes some valid points, but dismissing anyone who mails you and uses
GPG (because you have published your GPG public key) as nutjobs seems overly
antagonistic. That to me seems at odds with the greater goal of facilitating
easily attainable privacy and digital freedom for all.

------
ge0rg
There are two trust problems that verifiable builds are supposed to solve:

1. Did the authors manipulate the source code compared to what they
published?

2. Did a third party manipulate the binaries on the distribution channel?

_The process of verifying a build can be done through a Docker image
containing an Android build environment that we've published._

For the verification, you now depend on a _complex_ binary blob provided by
the authors, that is distributed through a different channel (Docker images
instead of Google Play).

This is a good solution to the second problem, but it does not preclude OWS
insiders from injecting malicious code (they merely need to add the backdoor
at the SDK level[0] and use that same SDK for the public releases). Such a
manipulation could be performed by an evil insider, or be part of a
"government cooperation". I am not saying that OWS is or will be doing this.
This is merely an observation of the shortcomings of the overall solution.
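The verification step ge0rg is describing boils down to a bit-for-bit comparison of the Play Store APK against a locally built one. A rough sketch of such a comparison (a hypothetical helper, not OWS's actual tooling) would unpack both APKs and hash every entry, skipping the signing metadata under `META-INF/`, which legitimately differs between a Play-signed build and a local build:

```python
import hashlib
import zipfile

def apk_entry_hashes(path):
    """Map each APK entry name to a SHA-256 of its contents.

    META-INF/ holds the signing certificate and per-file digests, which
    differ between the Play-signed APK and a locally built one, so it is
    excluded from the comparison.
    """
    hashes = {}
    with zipfile.ZipFile(path) as apk:
        for name in apk.namelist():
            if name.startswith("META-INF/"):
                continue
            hashes[name] = hashlib.sha256(apk.read(name)).hexdigest()
    return hashes

def apks_match(play_apk, local_apk):
    """True iff both APKs have identical entries outside META-INF/."""
    return apk_entry_hashes(play_apk) == apk_entry_hashes(local_apk)
```

If the comparison fails, diffing the two hash maps points at exactly which files did not reproduce. Note that this addresses only problem 2 above; it says nothing about what the build environment itself injected.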

[0] "Reflections on Trusting Trust"
[https://www.ece.cmu.edu/~ganger/712.fall02/papers/p761-thomp...](https://www.ece.cmu.edu/~ganger/712.fall02/papers/p761-thompson.pdf)

~~~
haffenloher
> For the verification, you now depend on a complex binary blob provided by
> the authors, that is distributed through a different channel (Docker images
> instead of Google Play).

Nothing stops you from building the Docker image yourself using the Dockerfile
provided in the repo: [https://github.com/WhisperSystems/Signal-Android/blob/master/Dockerfile](https://github.com/WhisperSystems/Signal-Android/blob/master/Dockerfile)

~~~
davexunit
But those Docker images are _not_ reproducible. They don't build everything
from source from a trusted, well-known set of bootstrap binaries, nor will
they produce bit-identical binaries. Docker does absolutely nothing to aid in
the task of reproducible builds. See
[https://reproducible-builds.org](https://reproducible-builds.org) for more
information about initiatives that are helping.

Using something like GNU Guix instead of Docker, one could make good progress
towards a reproducible Android tool chain that produces bit-identical APKs.
Reproducibility is an ongoing problem, but Guix has been carefully designed to
maximize reproducibility and to help identify what _isn't_ reproducible.

------
smartbit
As much as I like this, I'm refusing to use Signal because of Moxie's[1]
stance on requiring access to the address book on iOS [2].

Regretfully I'm not skilled enough to modify the code and create a branch.

[1]
[https://news.ycombinator.com/item?id=11288169](https://news.ycombinator.com/item?id=11288169)

[2] [https://github.com/WhisperSystems/Signal-iOS/search?utf8=&q=ADDRESSBOOK_RESTRICTED_ALERT_BODY](https://github.com/WhisperSystems/Signal-iOS/search?utf8=&q=ADDRESSBOOK_RESTRICTED_ALERT_BODY)

~~~
simoncion
Oddly, the Android version _seems_ to work just fine if it doesn't have access
to your contacts. (Like -ferinstance- if you use Cyanogenmod's Privacy Guard
stuff to deny access.)

I don't _know_, but I've heard Internet Rumors that OWS has been having
difficulty finding folks to work on the iOS version of Signal.

Not that you asked for it, but my stance on the read-contacts issue is this:
if I can't trust OWS to treat the contacts information that the Signal client
transfers to their server as confidential, then I sure as fuck can't trust
them to actually resist the urge to subvert either their client or their
server code (whether for financial gain, or because someone with a gun comes
knocking). Given that I trust OWS to act with integrity, I don't see any
significant harm in how they currently handle contact data.

~~~
nucleardog
CM's privacy guard doesn't deny access to the contacts, it returns a valid
address book listing... just an empty one instead of your real contacts.

Privacy Guard was implemented well before the idea of individual permissions
being denied made its way into the Android base, so most apps were not
implemented with this expectation and would explode completely if the call to
fetch the address book entries failed.

Which is just to say that that's not really a valid comparison.

~~~
simoncion
> CM's privacy guard doesn't deny access to the contacts, it returns a valid
> address book listing... just an empty one instead of your real contacts. ...
> Which is just to say that that's not really a valid comparison.

Wot? It's _totally_ a valid comparison.

The problem is transmission of contact information from your phone to a third
party. Telling Signal that you have an empty contact list solves that problem,
and it solves it _far_ better than just throwing a permissions error or
whatever when the software goes to request a contacts list.

~~~
nucleardog
> Oddly, the Android version seems to work just fine if it doesn't have access
> to your contacts.

You're specifically calling out CM's Privacy Guard, where the app is unaware
that it doesn't have access to your contacts. The difference in behaviour
between "Android" and iOS is not "odd" because it's not a 1:1 comparison, and
the CM way of doing it specifically mitigates the issue that arises.

A consistent comparison would be seeing what the app does when permission is
denied via the runtime permissions in Android 6, because that - like iOS -
informs the app that you've denied permission. There is no obvious incongruity
between the apps here.

~~~
simoncion
> A consistent comparison would be...

Ah! I see what you're saying (and the source of your objection) now. Thanks
for clarifying!

