
Apple Confirms “Back Doors”, Downplays Their Severity - gulbrandr
http://www.zdziarski.com/blog/?p=3466
======
joosters
What's really disappointing is that there seems to be an all-or-nothing
security model here. If I pair my phone with a computer, then suddenly it has
complete access to spy on me, install monitoring tools that can continue to
run, etc. Why can't there be a way where I can transfer music/photos to/from
my phone without providing this full device access?

You'd be pretty annoyed if the front door to your house, when you opened it,
also opened up your document safe, emptied your wallet onto the floor and
invited visitors to leave bugging devices to spy on you later.

Also, the defence of "just don't agree to pair your phone with an unknown USB
device" can actually be tricky. On a flight, I plugged my phone into the USB
port on the seatback to charge it. The phone repeatedly kept asking if I
wanted to pair it with something (who knows what it was? the entertainment
system, maybe?). If I had accidentally hit the wrong button only once (on a
prompt that randomly appeared), my phone could have been owned, and there's no
easy way to un-pair.

~~~
fredsted
This is what is worrying me the most.

Why can something even repeatedly ask for permission? (My iPhone was asking me
every 5 seconds the other day due to a faulty cable.) Even if there's a reason
for that, why isn't there a "don't ask again" button?

Why doesn't such a thing need a pin code or iCloud password entry?

Why aren't services like file_relay or pcap settings buried deep inside the
Advanced section of Settings.app, requiring password entry to enable and
featuring a warning message?

These are advanced features. Few people use them, so why not make them a
little more difficult to enable?

Why can't I opt out of Apple's access to my files? Sure, I can say "No" when
they ask at the store, but I should be able to say no in the Settings app as
well.

It's easier to enable packet capture than to complete the Provisioning
Profile process when releasing an app update to the App Store.

These are the things Apple should be answering instead of the vague support
note entry they published today.

~~~
joosters
I think my iPhone was repeatedly asking about it for similar reasons to yours,
as the connector is worn and it can take several plug attempts to make it even
charge. However, a malicious USB socket could cause repeated prompts by just
briefly dropping the power from time to time to trigger it. (n.b. I'm sure my
experience wasn't a sinister attack on my phone, just stating that this is
_possible_ to do)

The "Don't ask again" is a little tricky - is there enough information present
for the phone to tell if it is plugged into the same or different computer?

A better UI would be for the phone to _always_ default to not pairing. No
popup choice would be shown at all. After all, how often do you need to pair
to a new computer?

------
mnem
His work on security in iOS is quite interesting, but he seems determined to
spin everything for maximum publicity rather than, well, accuracy or truth,
which is a shame. For example, on that blog post he writes about pcapd and
developers:

    
    
        "Lets start with pcapd; I mentioned in my talk that pcapd has many legitimate uses such as these"
    

Yet in the slides for his talk[1], under theories, he writes:

    
    
        "Maybe for Developers for Debugging? No."
    

There are many examples of things like this in his writing, where actual facts
are left unsaid in order to wring maximum melodrama out of a particular
statement.

On top of that, he seems to continually avoid the point that to enable these
you need physical access to the device (for the pairing process to mark a
machine as trusted). If you have physical access, enabling debug[2] features
is probably the least of your worries.

Anyway, rant over. It just annoys me that genuinely interesting information
often seems to be spun by personalities to give it artificial gloss these
days, making it all feel a bit slimy and self-serving.

[1]
[https://pentest.com/ios_backdoors_attack_points_surveillance...](https://pentest.com/ios_backdoors_attack_points_surveillance_mechanisms.pdf)

[2] Debug if you're Apple, Back Doors if you're Mr. Zdziarski

~~~
api
You just described significant portions of the security industry, which runs
on maximizing the fear and FUD factor.

It's not just true of computer security. It's really true globally of the
entire "security" sector, from infosec to police to the global "national
security" defense/intelligence industry and so forth. Step 1: frighten, step
2: sell protection, step 3: profit.

Not saying there aren't risks out there, just that the industry markets itself
through bombast and sometimes exaggerates them.

Back to the infosec realm, the simple truth is that the only absolutely secure
system is one that is off and the only absolute privacy is in your own head
(maybe). Everything else is a matter of degrees of risk, and the curve is
hockey stick shaped. It's relatively easy to mitigate the big risks, but that
leaves a long tail of small risks and small vulnerabilities that require an
exponentially increasing amount of effort and inconvenience to deal with.

~~~
rqebmm
every industry trumps up the usefulness of their product, it's called
marketing. It's on the consumer to cut through the marketing-speak and
understand what they actually need to pay for.

~~~
TeMPOraL
> _every industry trumps up the usefulness of their product, it's called
> marketing._

It should be called lying and bullshitting, and I strongly believe that we
tolerate it far too much as a culture.

~~~
api
Marketing can degrade into con artistry. It isn't always, but it certainly
can.

That being said, I think in the long run people appreciate it if your
marketing isn't scummy.

(Not directly aimed at the original article, though I do think there's a lot
of deceptive, confusing, and overly hyperbolic FUD in the security sector.)

------
fredsted
I'm a little conflicted about this. On one hand it's good to learn about the
security of your device, on the other hand he's far too partial and
sensationalist about these iOS features. Yes, features.

• It's good to know packet capture can be remotely enabled on your device from
data collected on a computer the device has trusted.

• It's good to know Apple has the power to look through your encrypted files
given physical access (file relay).

• It's good to know one can extract files from his phone using a trusted
computer (house arrest).

However, that's it. There's no "back door". There's no (implied or otherwise)
NSA conspiracy. There's a reason why the media "misunderstood" his talk: it
was full of hyperbole.

~~~
aktau
> It's good to know Apple has the power to look through your encrypted files
> given physical access (file relay).

So the requisites are: "be Apple" and "have physical access"? That's awfully
little for what's supposed to be encrypted files.

It seems to be no secret that law enforcement sends devices to Apple when they
can't handle them themselves. So if I'm understanding it right: you can't
protect yourself with an iDevice; consider all your data compromised?

Sensationalist or not, it bothers me a little.

~~~
josho
What is the threat model that concerns you? If you don't trust Apple then
don't use iOS full stop.

I trust Apple engineers, but not Apple Store employees. So, for me the
question to ask is what effort is required to circumvent the system. If any
employee at an Apple Store can circumvent your encryption then that is an
issue. If circumventing requires your passcode, then that's far less
concerning.

~~~
Karunamon
Use your imagination. Apple could be compelled to dump a device you're sending
in for repairs, for example.

Many of these (backdoors/vulnerabilities/developer tools) have no reason to
exist on a non-developer-mode consumer device.

------
tlrobinson
This post appears to be gone. Here's Apple's (new) documentation on the
matter:
[http://support.apple.com/kb/HT6331?viewlocale=en_US&locale=e...](http://support.apple.com/kb/HT6331?viewlocale=en_US&locale=en_US)

If Apple is being truthful and transparent, calling this a "backdoor" is a bit
like calling sshd a "backdoor".

~~~
lucisferre
I don't see where they document how you disable or block these.

~~~
tlrobinson
They aren't enabled by default.

 _" requires the user to have unlocked their device and agreed to trust
another computer"_

and

 _" For users who have enabled iTunes Wi-Fi Sync on a trusted computer"_

~~~
shakethemonkey
They absolutely _are_ enabled by default, running as active daemons which
cannot be disabled by the device owner.

The circumstances by which they may be accessed include user agreement to
"trust another computer", but may not be limited to that.

Because DROPOUTJEEP.

------
IBM
>As usual, the media has completely derailed the intention of my talk.

Lol. The connotations in his presentation and his retweeting of all the press
it got were pretty clear. Seems to me like this guy is looking for his next
gig.

------
ryanmarsh
I'm getting 404s for this link and the root.

Cache:
[http://webcache.googleusercontent.com/search?q=cache:www.zdz...](http://webcache.googleusercontent.com/search?q=cache:www.zdziarski.com/blog/?p=3466)

~~~
deathanatos
Ah, thank you. It was a 500 earlier, and now it just says,

> Checking your browser before accessing zdziarski.com.
> This process is automatic. Your browser will redirect to your requested content shortly.
> Please allow up to 5 seconds…
> DDoS protection by CloudFlare

And then refreshes indefinitely.

------
coreymgilmore
So in short: Apple has back doors that they claim aren't really back doors
since only Apple apps can use them. If the NSA hasn't been using them already,
it is only a matter of time.

~~~
Alupis
If it's a backdoor for Apple, then it's a backdoor for anyone who can figure
it out (other apps, hackers, government agencies alike).

~~~
snowwrestler
As I understand it, there is more than just figuring out how it works; one
also needs to have physical access to the phone and be able to imitate Apple
cryptographically. Not out of reach for the NSA maybe, but not exactly typical
hacker stuff.

~~~
Alupis
> and be able to imitate Apple cryptographically

Like impersonate them cryptographically with a forged/stolen ssl cert?[1]

> one also needs to have physical access to the phone

The user only has to pair the device with their PC for the PC to become a
"trusted device", from which this exploit can be run.

[1] > Serious Security: Google finds fake but trusted SSL certificates for its
domains, made in France [http://nakedsecurity.sophos.com/2013/12/09/serious-
security-...](http://nakedsecurity.sophos.com/2013/12/09/serious-security-
google-finds-fake-but-trusted-ssl-certificates-for-its-domains-made-in-
france/)

~~~
chc
That is about the browser CA system, which is a mess of questionable trust.
It's not an attack that is applicable to Apple's own root certificate.

~~~
Alupis
> That is about the browser CA system

No, SSL certs are used to sign packages and software too. And Apple would not
have a root cert; their cert would be signed by a root CA, which could be used
to sign other certs if it's tricked into thinking it's Apple requesting them
(like in the recent Google cert example).

So one could impersonate a company if they have a cert that says they are that
company.

~~~
ZoFreX
No, it's about the browser CA system. I don't know exactly how Apple
implemented their signing for iDevices, but it's a reasonable assumption that
the certs need to be signed by Apple, and they didn't effectively hand the
keys over to every registrar in the world.

~~~
Alupis
[http://en.wikipedia.org/wiki/Code_signing](http://en.wikipedia.org/wiki/Code_signing)

They are likely using a plain-old SSL Cert signed by a plain-old public CA,
which is how your computer would know if the executable appears to come from
Apple or not.

~~~
nknighthb
First of all, code signing certificates are _not_ "plain-old SSL certs".
They're for code signing, not SSL.

Second, Apple includes their own root certificates in their own operating
systems just like everybody else. I've personally implemented a code signing
mechanism for a platform that had _no_ root certificates except for those I
_personally_ generated (and still control).

The public CA system is just irrelevant here. It has nothing to do with
anything.
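The point above, that a platform shipping its own trust anchor makes the
public CA system irrelevant, can be sketched minimally. This is a hedged toy
model of my own, not Apple's scheme: real code signing uses asymmetric
certificate chains (RSA/ECDSA), while here a hash-based stand-in keeps the
sketch self-contained.

```python
import hashlib
from dataclasses import dataclass

def sign(key: bytes, payload: bytes) -> bytes:
    # Toy "signature": hash of key material plus payload. A real scheme
    # uses an asymmetric algorithm so devices hold only the public half.
    return hashlib.sha256(key + payload).digest()

@dataclass
class SignedPackage:
    payload: bytes
    signature: bytes

# The platform's pinned trust anchor, shipped inside the OS image.
PINNED_ROOT = b"platform-root-key-material"

def verify(pkg: SignedPackage) -> bool:
    # Only signatures made with the pinned root validate; certificates
    # issued by any public CA simply never enter the picture.
    return sign(PINNED_ROOT, pkg.payload) == pkg.signature

good = SignedPackage(b"app-v1.0", sign(PINNED_ROOT, b"app-v1.0"))
rogue = SignedPackage(b"app-v1.0", sign(b"some-public-CA-key", b"app-v1.0"))

print(verify(good))   # True
print(verify(rogue))  # False
```

The design point is the pinning itself: a forged-but-publicly-trusted cert
(as in the Google incident cited upthread) fails here because verification
never consults the public CA set at all.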

------
bronson
cached copy:
[http://webcache.googleusercontent.com/search?q=cache:h5PtNdY...](http://webcache.googleusercontent.com/search?q=cache:h5PtNdYYHskJ:www.zdziarski.com/blog/%3Fp%3D3466+&cd=2&hl=en&ct=clnk&gl=us)

------
DickingAround
This sounds a bit like the same sort of customer-experience-related hacks that
MSFT used to (and maybe still does) put in all their software, and that caused
so many security holes. Poor attention to security won't just let in US
government intrusion; it'll also let in other governments and hackers.
Seriously, letting a 'trusted computer' enable that data syncing? They're
playing with fire. (Feel free to let me know if I'm missing anything here.)

~~~
Spooky23
That's an easy thing to say. But by the same token, if Apple required
individual authentication to access the filesystem remotely, critics would
scream about the privacy issues of linking a file transfer to a human
identity.

The process of establishing "trust" between computer and iOS device probably
needs a little work, but the concept itself isn't inherently insecure.

Why isn't the security press screaming about the scary inclusion of a massive
black hole of security risk in OS X... OpenSSH? All I need to do is get
physical access, click a checkbox in a preference pane, and copy a public key
to a user's home directory, and "poof", I own the box!

------
ddp
No, what's really disappointing is FUD directed at debugging tools. These
sorts of "presentations" and "research" are pointless. Anyone who does iOS
development knows about these tools; they're not secret. Apple has Tech Notes
and documentation on them in Xcode going back years. Let's please try to focus
our ire where it's needed.

And to whoever said that Google is "very open" about their malicious app
problems, well, gosh, where to start...

Google's Android is the cause of the malicious app problem. By not allowing
users to have fine-grained access control on the various entitlements in
Android, Google is forcing users to adopt an all-or-nothing approach to every
app they download. Don't like that this app wants access to your Contacts?
Fine, then don't install it. The root problem here is not allowing the user to
determine, after-the-fact, what privileges an app should have. Apple gets this
right, Google fails miserably.

Of course there's also no one Android. You know that, right? There's a bunch
of different Androids from a bunch of different carriers all of which run
different hacked-up versions littered with a bunch of crap code from carriers
that almost no one wants. Code which I imagine is also littered with security
bugs because it's written by carriers who barely give a damn if this junk even
works and wouldn't know "secure" if it hit them in the head.

And on top of all that, depending on your phone and your carrier, that brand
new phone you just bought might even be running an Android that's years out of
date and full of known vulnerabilities. There's no comparison between iOS and
Android when it comes to timely security updates. The Android ecosystem is a
complete fail on the security front at the moment. Period.

Google can play dumb if they want. Plausible deniability is oftentimes quite
useful after all...

~~~
happyscrappy
These types of stories are upvoted without being read because, despite
intimations that iOS users are hipsters, the real hipsters are people who use
Android because of some imaginary freedom. Most Android users don't care about
fake hipster stances and just want a cheap phone. They are not willing to pay
for security, and they have none: thanks to carrier foolishness, any
moderately talented hacker can easily own an Android, even if the user only
installs apps from Google's walled garden. Pretty sad to see HN so taken in by
this nonsense.

~~~
rahimnathwani
How could a cracker own my Android? I bought an unsubsidised Moto E. It has no
carrier bloatware, and prompts me when there is a new version of firmware
available. If it's not rooted, and I only install apps from Google Play, what
are the known attack vectors?

~~~
djrogers
The answer is in the question: [http://www.pcworld.com/article/2099421/report-
malwareinfecte...](http://www.pcworld.com/article/2099421/report-
malwareinfected-android-apps-spike-in-the-google-play-store.html)

~~~
rahimnathwani
The definition of malware in that article is quite broad, and includes apps
which are just 'collecting and sending GPS coordinates'.

Even if one of the apps I've installed from Google Play is malware (by the
definition in the article) it doesn't mean my phone is 'owned'. An attacker
can't run arbitrary code on the device or get copies of my data or send texts
pretending to be me.

------
xenadu02
Seems like iOS 8 should offer a settings screen to allow you to revoke sync
keys and/or see a list of computers you've trusted in the past. Perhaps it
should default to deleting the keys if you haven't sync'd with a specific
computer in some timeout period (30 days?).
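The suggested timeout could work roughly like this. A minimal sketch, assuming
the OS kept a last-sync timestamp per trusted host; the function and record
names are invented for illustration, not how iOS actually stores pairing
records:

```python
import time

THIRTY_DAYS = 30 * 24 * 3600  # the suggested timeout, in seconds

def prune_pairings(pairings: dict, now: float,
                   timeout: float = THIRTY_DAYS) -> dict:
    """Drop pairing records for hosts not synced within the timeout window."""
    return {host: last_sync for host, last_sync in pairings.items()
            if now - last_sync <= timeout}

now = time.time()
records = {
    "home-mac": now - 5 * 24 * 3600,     # synced 5 days ago: keep
    "old-laptop": now - 90 * 24 * 3600,  # synced 90 days ago: revoke
}
print(sorted(prune_pairings(records, now)))  # ['home-mac']
```

A settings screen listing these records would then only need to call the same
pruning logic on demand to implement manual revocation.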

A few of the services should be locked down a bit further regardless of
anything else.

I also don't see this as a valid bypass of encrypted files - you need the
device to be on and to have had its passcode entered. That's a far cry from
taking a cold device, booting it, then connecting with a stolen sync key.
Besides, we've known for some time that you were unsafe if the device was
unlocked - some police even carry Faraday bags and portable chargers to keep
seized devices accessible, probably for this very reason.

------
jradd
I remember when it was trivial to examine artifacts from itunes backup until
backup encryption was implemented with passphrase. (v6 I think?)

Something that still has the capabillity to bypass backup encryption sounds
incredibly dangerous from my perspective.

There are plenty of legitimate concerns mentioned in his talk. I agree there's
no cause for panic, but what about the fact that there are obviously services
not disclosed to us - developers, users, enterprise executives relying on this
as a trusted platform, etc…

The potential risk this poses (or implies) makes the lack of initial
disclosure seem criminally negligent at the least. If Apple wants to balance
the scale, they will need to do more than address and resolve these issues.
They need to extend their transparency a smidgen. :)

------
peterwwillis
Resolving the hyperbole debate: asking a user "May I connect to some device?",
then installing permanent remote access to the device, and never prompting the
user again nor giving them further information, is a plain and simple
backdoor.

The difference between this and malware is malware authors create web pages
explaining to users to "Just click OK and don't ask what this is" before they
deliver you a backdoored application.

If the prompt said "May we install remote access tools that allow us to
remotely control and remove data from your device forever?", then it wouldn't
be a backdoor. It would be a front door.

------
ehPReth
I haven't had much luck finding a video of the talk, has anyone else?

~~~
fredsted
No, but there's talk that it will be out soon.

------
Gavin321
syncing iPhone to new computer [http://www.leawo.org/tutorial/how-to-sync-
iphone-to-new-comp...](http://www.leawo.org/tutorial/how-to-sync-iphone-to-
new-computer.html)

------
zoom
Later Apple.

 _drops mic_

------
freeslugs
Not Found

The requested URL /blog/ was not found on this server.

Additionally, a 404 Not Found error was encountered while trying to use an
ErrorDocument to handle the request.

------
lnanek2
Kind of glad Apple just confirmed the services are there and otherwise ignored
him. I'm sure he has a nice career ahead of him, complaining next that the cp
command in the adb shell on Android isn't hard-coded to ignore any path with
DCIM (user pictures) in it, and other nonsense. Honestly, he isn't helping
anything, and he is just making it harder for Apple to fix broken phones and
provide better customer service in general.

Wonder what he thinks of Amazon's Mayday showing your screen to customer
support remotely. Users love it, since customer support can now guide you to
exactly the right settings and other things, but I think privacy nuts like
this will have seizures.

------
svgm
Most companies will downplay any negative aspect of their product; it's pretty
normal, part of the survival aspect of an organization. Microsoft has done the
same thing a few times as well. [http://www.zdnet.com/blog/security/microsoft-
downplays-bitlo...](http://www.zdnet.com/blog/security/microsoft-downplays-
bitlocker-password-leakage/1841)
[http://www.computerworld.com/s/article/9133248/Microsoft_con...](http://www.computerworld.com/s/article/9133248/Microsoft_confirms_serious_IIS_bug_downplays_threat?intsrc=news_ts_head)

I'm more surprised by the fact that Apple decided to actually confirm the
existence of a back door in their product (even though they are "misleading"
(as stated in the article) about what is really at risk here). The fact that
Apple was downplaying this tells me they haven't realized that a product,
especially operating systems and computers, depends a lot on its userbase; if
the userbase is kept ignorant, then Apple will keep itself in its 'comfortable
zone', since it's not being pushed by the users to improve.

Nonetheless, it's still pretty good that Apple has confirmed this; baby steps,
I guess.

