
Binary Transparency for Firefox - _jomo
https://wiki.mozilla.org/Security/Binary_Transparency
======
aeijdenberg
If I'm understanding correctly, the plan is to piggy-back on top of the
existing Certificate Transparency [0] infrastructure by issuing a regular X509
certificate per Firefox release, but for a special domain name that includes a
Merkle tree hash for the files in that release, with a known suffix
(".fx-trans.net").

In that manner they can piggy-back on top of the CT ecosystem (existing
logs, existing search / monitoring tools, and presumably gossip if/when
that's solved).
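If it helps to make the scheme concrete, here's a rough Python sketch of the
idea as I understand it. The tree construction (duplicate-last-node, not the
exact RFC 6962 construction CT uses) and the label layout are my assumptions,
not necessarily what Mozilla actually does:

```python
import hashlib

def merkle_root(leaves):
    """Toy binary Merkle tree over the release files. The last node is
    duplicated on odd-sized levels; real CT logs use RFC 6962 instead."""
    level = [hashlib.sha256(b"\x00" + leaf).digest() for leaf in leaves]
    while len(level) > 1:
        pairs = [(level[i], level[i + 1] if i + 1 < len(level) else level[i])
                 for i in range(0, len(level), 2)]
        level = [hashlib.sha256(b"\x01" + l + r).digest() for l, r in pairs]
    return level[0]

# Hypothetical release contents, just for illustration.
files = [b"firefox.exe bytes", b"libxul.so bytes", b"omni.ja bytes"]
root_hex = merkle_root(files).hex()

# A SHA-256 hex digest is 64 chars but a DNS label maxes out at 63,
# so the hash would have to be split across two labels (layout assumed).
domain = root_hex[:32] + "." + root_hex[32:] + ".fx-trans.net"
```

A CA issuing a certificate for that domain then forces the release hash into
the public CT logs as a side effect of normal certificate logging.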

This seems like a really cool hack! The state of binary software distribution
is really pretty scary when you think about it - techniques like this have the
potential to restore a lot of confidence.

[0] [http://www.certificate-transparency.org/](http://www.certificate-transparency.org/)

~~~
kbenson
_Specifically, Certificate Transparency makes it possible to detect SSL
certificates that have been mistakenly issued by a certificate authority or
maliciously acquired from an otherwise unimpeachable certificate authority. It
also makes it possible to identify certificate authorities that have gone
rogue and are maliciously issuing certificates._

Interesting. I assume this either helped with the evidence for - or was
developed because of - the whole Symantec CA dustup going on?

~~~
aeijdenberg
CT significantly pre-dates the recent Symantec issues, but yes, it does
provide an excellent tool for providing evidence of misissuance [0] [1] - and
that's the crux of it - in order for a certificate to be considered valid in a
CT world, it must present proof that it has been publicly logged.

[0] [https://security.googleblog.com/2015/09/improved-digital-cer...](https://security.googleblog.com/2015/09/improved-digital-certificate-security.html)

[1] [http://searchsecurity.techtarget.com/news/450411573/Certific...](http://searchsecurity.techtarget.com/news/450411573/Certificate-Transparency-snags-Symantec-CA-for-improper-certs)

------
dane-pgp
This is a fantastic step forwards for Binary Transparency, which I hope is
followed by Linux distros and package managers, so all Free Software gets the
benefit.

The one worry that comes to mind, though, is that once a binary transparency
log check is made mandatory for any update to a piece of software, there is a
risk that a bug in the log checking code makes it impossible to ever upgrade
the software again. (This reminds me of the HPKP Suicide attack, but is not
quite the same).

Obviously it should be possible, with Firefox at least, to manually download a
new copy of the installer and install it from scratch, but I feel there should
be a fall-back mechanism where, say, a release signed with a special offline
key should be allowed to skip the transparency check (perhaps only if the
transparency check has been failing on an offered upgrade for more than a
month).

~~~
jopsen
A bug in the log checking code is no worse than a bug in your signature
verification code. But obviously, denial-of-update attacks on the log
infrastructure should be mitigated in some way before this is mandatory.

An upside to not having a fall-back mechanism is that you can't produce a
secret update, no matter how many $5 wrenches the NSA can afford
([https://xkcd.com/538/](https://xkcd.com/538/)).

------
aanm1988
I wonder how much effort it would take them to actually get to fully
reproducible builds.

edit:

here,
[https://bugzilla.mozilla.org/show_bug.cgi?id=885777](https://bugzilla.mozilla.org/show_bug.cgi?id=885777)

~~~
rebelwebmaster
As long as PGO (profile guided optimization) is still a thing that yields a
significant speed boost, a lot.

~~~
AgentME
The profile data used could be published.

------
ComodoHacker
This is all good and nice, especially if reproducible builds come true. But
the devil is in the extensions. They are the weakest link.

------
bigbugbag
Binary transparency seems like a nice thing to have, though it is quite
limited in scope on Linux, as distros usually compile from source. It is
even more limited since Mozilla knowingly makes controversial choices,
stating that unhappy users and distros can recompile with a build flag until
the code is stripped.

IMHO Mozilla should orient its transparency effort towards its decision
process first, so we don't end up with a binary-transparent browser no one
uses because management decided to remove user choice and break the UI (to
look more like Chrome), break the extensions that contributed to Firefox's
success (to be more like Chrome), require PulseAudio and drop ALSA, and so
on.

~~~
Vinnl
All the examples you mentioned were communicated pretty transparently in
Bugzilla, developer blog posts and announcement blog posts. Just because you
disagreed with them doesn't mean that the reasoning wasn't public or that
you couldn't have contributed to it.

------
snakeanus
Why not just use openpgp signing instead like most GNU/Linux distros?

~~~
skybrian
Signing a binary allows the updater to verify that it came from a trusted
source, but it doesn't tell you whether they gave you the same binary as
everyone else, or a custom one.

Binary transparency ensures that there's a complete, public list of all
updates sent out. It's an additional level of verification showing that the
source isn't up to any shenanigans.

It's more of a deterrent though. It doesn't prevent sending a custom update;
it just makes it difficult to hide.
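To make the distinction concrete, here's a toy Python sketch of that update
policy. HMAC stands in for a real vendor signature and a plain set stands in
for a CT-style log, so none of this is actual updater code:

```python
import hashlib
import hmac

VENDOR_KEY = b"vendor-signing-key"   # stand-in for a real signing keypair
PUBLIC_LOG = set()                   # stand-in for a public, append-only log

def sign(binary):
    """Toy 'signature': HMAC under the vendor's key."""
    return hmac.new(VENDOR_KEY, binary, hashlib.sha256).digest()

def log_release(binary):
    """Publish the release hash where everyone can see it."""
    PUBLIC_LOG.add(hashlib.sha256(binary).hexdigest())

def accept_update(binary, sig):
    # A signature proves origin; log membership proves the world can
    # see that this exact binary was shipped.
    signed = hmac.compare_digest(sign(binary), sig)
    logged = hashlib.sha256(binary).hexdigest() in PUBLIC_LOG
    return signed and logged

release = b"firefox release bytes"
log_release(release)

# A validly signed but unlogged ("secret") build fails the second check,
# which is exactly the shenanigan transparency is meant to expose.
targeted = b"firefox targeted build"
```

The point is that the signature check alone would accept the targeted build;
only the log-membership check distinguishes it.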

------
DorothySim
Really interesting hack. It basically gives (almost) free timestamping
(using Let's Encrypt for cert issuance and CT logs for storing the
information). Previously one would use Bitcoin OP_RETURN outputs for
timestamping [0].

[0]:
[https://en.bitcoin.it/wiki/OP_RETURN](https://en.bitcoin.it/wiki/OP_RETURN)

------
TorKlingberg
Is Debian / Ubuntu doing anything like this?

~~~
pietroalbini
There is an effort to make all the packages' builds reproducible in Debian,
which should automatically propagate to Ubuntu for most of the packages.

~~~
smcl
Funny aside - the Debian reproducible build effort has uncovered bugs like
this: [https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=848721](https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=848721)

------
gary4gar
What's wrong with just doing SHA1?

~~~
simplehuman
SHA1 collisions...

~~~
aanm1988
what's wrong with doing <insert hash function>?

~~~
aeijdenberg
Making a hash of the release is just a small part of it (and is the first part
of what they are doing).

The trick is to be confident that you're getting the same hash as everyone
else - and that's what requiring proof that it has been added to a CT log
gives you some assurance of.
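For anyone curious what a "proof" means mechanically here, this is a toy
sketch of a Merkle inclusion proof - a simplified duplicate-last-node tree
rather than the exact RFC 6962 construction real CT logs use:

```python
import hashlib

def _leaf(data):
    return hashlib.sha256(b"\x00" + data).digest()

def _node(l, r):
    return hashlib.sha256(b"\x01" + l + r).digest()

def build_levels(leaves):
    """All levels of a toy Merkle tree, bottom (leaves) to top (root)."""
    levels = [[_leaf(d) for d in leaves]]
    while len(levels[-1]) > 1:
        cur = levels[-1]
        levels.append([_node(cur[i], cur[i + 1] if i + 1 < len(cur) else cur[i])
                       for i in range(0, len(cur), 2)])
    return levels

def inclusion_proof(levels, index):
    """Sibling hashes from leaf to root - what a log hands back."""
    path = []
    for level in levels[:-1]:
        sib = index ^ 1
        path.append(level[sib] if sib < len(level) else level[index])
        index //= 2
    return path

def verify_inclusion(data, index, path, root):
    """Anyone can recompute the root from their own copy of the data."""
    h = _leaf(data)
    for sib in path:
        h = _node(h, sib) if index % 2 == 0 else _node(sib, h)
        index //= 2
    return h == root

leaves = [b"release-1", b"release-2", b"release-3"]
levels = build_levels(leaves)
root = levels[-1][0]
proof = inclusion_proof(levels, 1)
```

The short proof (log-sized, not the whole log) is what lets a client check
that its release hash really sits under the same tree head everyone else sees.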

------
copper_rose
The stated goal is to enable someone to verify "that they have gotten the same
version as the rest of the world and not a special, possibly compromised
version." This is actually two goals: (1) verify that your version is the same
as everyone else's, and (2) verify that that version is genuine.

Why should one care about (1)? All that really matters is (2). As long as I'm
using a genuine release, does it matter what the rest of the world is using?
Unless I wish to establish trust in a binary based on how popular it is, or
unless I care about interoperability between the version I have and the
version others have, it doesn't really matter what version everyone else has.

I wonder if the author has heard about Nix or Guix? The purely functional
software deployment model pioneered by Nix solves (2) trivially, for
practically all applications in general, not just Firefox specifically. It
also solves many other problems in the field of software deployment that this
article doesn't even mention.

Long story short, don't reinvent the wheel. Use Nix or Guix. Learn more by
reading the first chapter of Eelco Dolstra's thesis, which describes the
problems and how the Nix model solves them:

[https://nixos.org/~eelco/pubs/phd-thesis.pdf](https://nixos.org/~eelco/pubs/phd-thesis.pdf)

Edit: Even if one is concerned about (1), the Nix model enables ways to verify
that the origin is actually sending a binary that was built from the source it
claims to use. For example, consider "guix challenge":

[https://www.gnu.org/software/guix/manual/html_node/Invoking-...](https://www.gnu.org/software/guix/manual/html_node/Invoking-guix-challenge.html)

~~~
kibwen
The reason to care about getting the same binary as the rest of the world is
that it increases the likelihood that an attack will be detected.

In the case with neither binary transparency nor reproducible builds, a
nefarious actor can target a single user with a tainted binary; it's
unlikely that the user will find out, and difficult for them to rule out the
possibility of tampering up-front.

In the case with binary transparency but no reproducible builds, a nefarious
actor must target all users which makes it more likely that someone will
notice, but still difficult for people to rule out tampering up-front.

In the case with reproducible builds but no binary transparency, it's easy for
people who are paranoid to rule out tampering with the binary, but people who
aren't paranoid are unlikely to discover that their specific binaries were
tampered with, so a targeted attack will still probably go undetected.

In the case with both reproducible builds and binary transparency, it only
takes one paranoid person discovering a tampered binary to alert the whole
world that tampering has occurred. It's safety in numbers, even for those
not technically literate enough to determine (or even suspect) tampering.

~~~
copper_rose
Thank you for the clarification. I can see from your examples why binary
transparency is a useful concept worth considering in its own right. I still
suspect there is a huge amount of overlap between the problems the author is
trying to solve and the ones that Nix/Guix has already solved (especially the
way they want to use a hashing algorithm to identify the release). I'll bet a
general solution for binary transparency could be built - a solution from
which practically all software in general could benefit, not just Firefox in
particular - by building on top of (or at least learning from) the base that
the purely functional software deployment model, as pioneered by Nix, has
already given us.

I am not simply saying "They should use Nix" as if that would magically
accomplish their goals. I am saying that they could build on top of, or at
least learn from, the novel techniques that Nix has contributed to the field
of software deployment.

~~~
aaronlevin
One of the people involved in the reproducible builds project is a NixOS
committer. Fairly certain they're aware of Nix/Guix.

~~~
copper_rose
Does the reproducible builds project have a hand in the project to give
Security/Binary Transparency to Firefox? I ask because I don't know, and I
saw no language suggesting that on the linked page.

