
Most software already has a “golden key” backdoor: the system update - discreditable
https://arstechnica.com/security/2016/02/most-software-already-has-a-golden-key-backdoor-its-called-auto-update/
======
stavrianos
But, at _some point_ there must be trust. If you don't trust software, you can
try to sandbox it, but now you have to trust the sandbox. This devolves very
rapidly. Open source at least provides a facsimile of recourse - just go read
the code - but how much of your currently-running open-source code have you
actually read? For that matter, if you had, could you be confident that you'd
understood it? The Underhanded C Contest is a thing, after all. A sufficiently
paranoid individual can only run code they wrote themselves. Or, they can
choose to run code without a strong understanding of what it's doing.

If Apple subverts their updates, that's mostly interesting as a signal of
their trustworthiness moving forwards. The coolest thing about this is that we
know it's happening at all, I think.

~~~
jMyles
> Open source at least provides a facsimile of recourse - just go read the
> code - but how much of your currently-running open-source code have you
> actually read?

Wait, what? That is _not_ the recourse that open source provides.

The great thing about open source is that you don't need _every_ person to
read the code, just _one_ person who can either catch or verify the absence of
user-abusive material.

Moreover, even if _zero_ people read the code today, it is preserved so that
state (or corporate) abuse can be revealed later, providing another
disincentive to introduce abusive material.

~~~
marcosdumay
How do you know you are running the same software that everybody else is
reviewing?

~~~
dankohn1
Please see the Reproducible Builds project, a vital contribution to answering
this question.

[https://reproducible-builds.org/](https://reproducible-builds.org/)
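The core idea can be sketched in a few lines: if builds are reproducible, anyone can rebuild a package from the reviewed source and compare digests with the distributed binary. This is only a toy sketch; the helper names and file paths are illustrative, not part of any real tooling.

```python
# Toy illustration of the reproducible-builds check: if two independent
# builds of the same source yield byte-identical artifacts, their digests
# match, so anyone can verify a distributed binary against the source.
import hashlib

def artifact_digest(path: str) -> str:
    """SHA-256 of a build artifact, read in chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

def builds_reproducible(my_build: str, upstream_build: str) -> bool:
    """True iff my local rebuild is bit-identical to the distributed one."""
    return artifact_digest(my_build) == artifact_digest(upstream_build)
```

In practice the hard part is not the comparison but making the build itself deterministic (timestamps, file ordering, build paths), which is exactly what the project works on.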

~~~
marcosdumay
Reproducible builds do not cut it. If you have a known good binary signature,
you'd also have a known good source signature and wouldn't have the problem.

Or, to put it in better words, where do you get the certificate to check your
build from? At extreme paranoia levels, you simply can never be sure you have
the same software as everybody else, thus the only safe alternative is
reviewing your copy yourself.

(How do you know the computer is showing you the correct contents of your
files? I haven't thought that one through well enough yet.)

~~~
dankohn1
Diverse Double Compiling is a proven solution to the "Trusting Trust" problem
[0]. So, if a package maintainer signs a package and posts that signature on
an https page, I can have a high level of confidence that the software I
compile and run on my machine is identical to what everyone else is reviewing.

Here is some advice from Schneier on running secure software against a state-
level adversary [1][2]. However, even that is not immune from a black bag job
[3].

[0] [http://www.dwheeler.com/trusting-trust/](http://www.dwheeler.com/trusting-trust/)

[1] [https://www.schneier.com/blog/archives/2013/10/air_gaps.html](https://www.schneier.com/blog/archives/2013/10/air_gaps.html)

[2] [https://www.schneier.com/blog/archives/2014/04/tails.html](https://www.schneier.com/blog/archives/2014/04/tails.html)

[3] [https://en.wikipedia.org/wiki/Black_bag_operation](https://en.wikipedia.org/wiki/Black_bag_operation)
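For what it's worth, the DDC check can be modeled in a few lines. This is only a toy simulation (a "compiler binary" here is just a tuple, and real DDC compares the actual bits of real binaries), but it shows why rebuilding the suspect compiler with an independent trusted compiler exposes a self-propagating trusting-trust backdoor:

```python
# Toy model of Wheeler's Diverse Double-Compiling (DDC) check.
# A "compiler binary" is a tuple (compiled_source, has_backdoor).
COMPILER_SOURCE = "source of the compiler under test"

def run_compiler(binary, source):
    """Execute a compiler binary on some source, producing a new binary."""
    _, has_backdoor = binary
    # A trusting-trust compiler re-inserts its payload whenever it
    # recognizes that it is compiling its own source.
    inject = has_backdoor and source == COMPILER_SOURCE
    return (source, inject)

def ddc_check(suspect_binary, trusted_binary):
    """Rebuild the suspect compiler from its source via an independent
    trusted compiler, self-compile once more, and compare the result with
    what the suspect binary itself produces from the same source."""
    stage1 = run_compiler(trusted_binary, COMPILER_SOURCE)
    stage2 = run_compiler(stage1, COMPILER_SOURCE)
    regenerated = run_compiler(suspect_binary, COMPILER_SOURCE)
    return stage2 == regenerated  # True => no self-propagating backdoor

# Demo inputs: a clean self-hosted compiler, a subverted one, and a
# diverse trusted compiler built from different source.
clean = (COMPILER_SOURCE, False)
subverted = (COMPILER_SOURCE, True)
trusted = ("independent compiler source", False)
```

The subverted binary fails the check because its self-compilation carries the payload while the independently rebuilt one does not; the trusted compiler only needs to be *different*, not known-good.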

------
userbinator
Remember when software didn't need updates almost every day? It seems like
we've regressed in terms of quality and the general principle of _doing it
right the first time_. I understand that some things do change and bugs do
occur, but I don't think _everything_ needs to, nor should the rate of bugs
being found be anywhere near what it is today.

Personally, I prefer stability over "new features", turn off automatic
updates, and read changelists carefully. Anything which doesn't have a good
description and rationale of _why_ it needs to be changed, and how that is
relevant to my usage of the software, doesn't get changed. (And software which
doesn't give me the option of doing so, doesn't get used either.)

~~~
eponeponepon

> It seems like we've regressed in terms of quality and the general
> principle of doing it right the first time.

This is what Agile ideologues have brought us. "Valuing working software over
documentation" does not, in my experience, tend to result in workable
documentation, and that in turn doesn't lead to working software.

~~~
xlayn
There is no way to guarantee doing it right the first time. Most software
development methodologies treat the iterative process as a measure of caution,
prevention, or risk avoidance.

In fact, CMMI stresses the importance of measuring, iterating, and fixing as
soon as you can... which is the response to the premise that you can't "just
do it right."

~~~
forgottenpass
I don't think I get where your post is going. When I hear some combination of
"getting it right" and "first time" I don't presume that the development team
only gets one crack at something. Instead, it just sounds like iterating
internally on something until it's ready without cutting corners.

~~~
xlayn
It goes to show that the poster's idea

>Agile ideologues -> just doesn't lead to working software

is flawed; if you look at the Agile core values, all of them are focused on
reducing errors and improving quality [0].

It's also a very broad generalization, and it leaves open the question: what
_does_ actually lead to working software?

[0] [http://www.agilemanifesto.org/](http://www.agilemanifesto.org/)

~~~
eponeponepon
On a small point of order, I'm slightly worried that you might have read
"ideologues" as a typo for "ideologies" - very much not the case! I very much
meant ideol _ogues_.

I don't take much issue with the underlying ideas of the Agile manifesto -
it's the ritualisation and thoughtlessness that it can lead to that irks me.
The "anti Agile manifesto", many though its faults are, rings true in a lot of
places, to my mind: "backlogs are really just to do lists" seems pretty much
inarguable, for one.

------
whoopdedo
The important difference between a FLOSS system like Debian versus the walled
garden of Apple is the ability to choose which keys you trust. If I lose trust
in one of the Debian signers I can remove that key. I can select a different
repository to download packages from. Or I can stop accepting updates and
apply patches manually.

As to a key being a single point of failure, PGP allows for multiply signed
documents. Couldn't Debian require packages to be signed by at least two keys?
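A threshold rule like that could look roughly like the following. This is a toy sketch: the signer names and keys are made up, and stdlib HMAC stands in for the public-key signatures (e.g. GnuPG/Ed25519) a real archive would use.

```python
# Toy "two-person rule" for package acceptance: an update is installed
# only if at least `threshold` distinct trusted signers produced a valid
# signature over it. HMAC is a stdlib stand-in for real PGP signatures.
import hmac
import hashlib

TRUSTED_SIGNERS = {            # signer name -> verification key (made up)
    "debian-ftpmaster": b"key-one",
    "debian-security":  b"key-two",
    "debian-backup":    b"key-three",
}

def sign(key: bytes, package: bytes) -> bytes:
    return hmac.new(key, package, hashlib.sha256).digest()

def accept_package(package: bytes, signatures: dict, threshold: int = 2) -> bool:
    """Count valid signatures from distinct trusted signers."""
    valid = sum(
        1
        for signer, sig in signatures.items()
        if signer in TRUSTED_SIGNERS
        and hmac.compare_digest(sig, sign(TRUSTED_SIGNERS[signer], package))
    )
    return valid >= threshold
```

With a threshold of two, compromising any single key (or coercing any single key holder) is no longer enough to push a malicious package.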

------
studentrob
> Is it reasonable to describe these single points of failure as backdoors?

Yes. The case that Apple is making is, in part, that the FBI is forcing them
to _use_ a back door by putting Apple's stamp of approval on it with their
signing key. Whether or not you agree that Apple must _create_ the back door
or not, they are still being asked to approve its use. Apple says this is
compelled speech and violates the first amendment. Their brief has more
details about it. The Tech Dirt summary has the most details [1]. Here's one
notable passage,

 _“The government asks this Court to command Apple to write software that will
neutralize safety features that Apple has built into the iPhone in response to
consumer privacy concerns... This amounts to compelled speech and viewpoint
discrimination in violation of the First Amendment.”_

> I think many people might argue that industry-standard systems for ensuring
> software update authenticity do not qualify as backdoors, perhaps because
> their existence is not secret or hidden in any way.

It's relative. That used to be somewhat hidden. Now it's very out in the open.

> Having access to a "secure golden key" could be quite dangerous if
> sufficiently motivated people decide that they want access to it.

Yeah. So let's not compromise Apple's existing security procedures by forcing
that out in the open.

> I expect that in the not-too-distant future, for many applications at least,
> attackers wishing to perform targeted malicious updates will be unable to do
> so without compromising a multitude of keys held by many people in many
> different legal jurisdictions.

I hope this day comes soon. For now, let's continue fighting for our right to
privacy.

[1] [https://www.techdirt.com/articles/20160225/15240333713/we-
re...](https://www.techdirt.com/articles/20160225/15240333713/we-read-
apples-65-page-filing-calling-bullshit-justice-department-so-you-dont-have-
to.shtml)

------
xlayn
>"Being free of single points of failure should be a basic requirement for any
new software distribution mechanisms deployed today."

As in write error-free software?

------
facetube
The article claims that forcing Apple to write the software "isn't a big deal,
as they could pay someone to do that". If the software malfunctioned and/or
erased evidence on the device, who would be liable?

~~~
xlayn
Well, the article mentions that; but the main point is that update mechanisms
are a single point of failure, because by design they deliver a change that
will be applied and run as root.

On the other hand, the article also lacks references or numbers on how many
times this has actually been exploited.

~~~
natch
That's why updates only happen when the user accepts the update after
authenticating themselves.

~~~
DavideNL
Exactly, that's the part I don't understand yet...

Why doesn't Apple create a mechanism which "only allows updates to be applied
after the correct pin is entered"? Then, an update created by the FBI, which
would disable security mechanisms, could not be installed (without knowing the
correct pin).
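A toy version of that gate might look like the following. It is purely hypothetical (a real device would enforce this inside the secure element, not in the installer), and the function names are illustrative; the point is just that the installer stores only a salted hash and refuses to proceed without the PIN.

```python
# Hypothetical sketch of a PIN-gated update installer: the update is
# applied only after the user proves knowledge of the device PIN. Only a
# salted PBKDF2 hash of the PIN is stored, never the raw PIN.
import hashlib
import hmac
import os

def enroll_pin(pin: str):
    """At setup time: derive and store (salt, hash) for the chosen PIN."""
    salt = os.urandom(16)
    stored = hashlib.pbkdf2_hmac("sha256", pin.encode(), salt, 100_000)
    return salt, stored

def apply_update(update: bytes, entered_pin: str, salt: bytes, stored: bytes) -> bool:
    """Install `update` only if the entered PIN matches the enrolled one."""
    candidate = hashlib.pbkdf2_hmac("sha256", entered_pin.encode(), salt, 100_000)
    if not hmac.compare_digest(candidate, stored):
        return False  # wrong PIN: update rejected, even if validly signed
    # ... install the (already signature-checked) update here ...
    return True
```

Under this scheme an FBI-authored, Apple-signed image still couldn't be installed on a locked phone, because installation itself requires the secret the investigators don't have.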

------
natch
This is why EFF urges that software makers (applies to makers of hardware too,
if there is a software component) should always keep the end user in the loop
of deciding whether to accept updates.

------
vonklaus
This is totally correct. I agree with everything he says. However, in a world
where everyone uses:

Android

Chrome

Google DNS

Google Analytics

Google Authenticator

Google Certificates (issue and validate)

Hangouts

Gmail

Google Docs

Much machine learning runs on Google code, using Google open-sourced software.

Search

Chromecast

Maps

Google Connect (Starbucks, other retailers)

Google domain registration

Google Chrome/WebKit

Google Wallet

Google Play Store

Google CDN for serving code/fonts/libraries

Google self-driving cars are coming

=================================

I am not saying Google is bad. I'm not even criticizing this article, which I
think was well thought out; not many people are even saying this, although it
is fairly intuitive. The larger paradox is this:

Many people talk about some database being sloppy and how they would have it
replicated, maybe even with a cold backup in AWS Glacier on top of the other
backups, and yet the world is using essentially three stacks: Google, Apple,
and Microsoft.

Outside of this community, and a few other places, it would be highly unlikely
that someone runs their own OpenWrt router, has FreeBSD on their computer,
fully encrypts all their email, runs their own mail server, owns no cellphone,
uses a Garmin to navigate, AND

that everyone in their network does this too, so that they are sufficiently
insulated.

------
cmurf
Over the weekend, Apple pushed an update to a kernel extension exclusion list
that broke Ethernet for thousands of users. There was no notification of this
update; it was simply pushed and installed, and suddenly Ethernet was broken -
as in, not even visible to the system, because the Ethernet driver (kext) was
disallowed from loading.

So... yeah, it's a mistake, and annoying. But in some sense we're expecting
much more privacy, security, and reliability from our mobile devices than from
our desktop systems. And I think that's an interesting shift in expectations.

------
mtgx
Maybe it's time to ask Google to ask for permission before updating Chrome?

------
ape4
A system update is how Apple would install a backdoor, if they were forced to.

