

iOS: About diagnostic capabilities - comex
http://support.apple.com/kb/HT6331?viewlocale=en_US&locale=en_US

======
jatoben
I was surprised Zdziarski made such a big deal over the packet capture tool.
It's been documented in the referenced developer Q&A and in various blog
posts[1] since iOS 5, and I've personally found it very useful for
troubleshooting connectivity problems on enterprisey networks.

[1]: [http://useyourloaf.com/blog/2012/02/07/remote-packet-capture...](http://useyourloaf.com/blog/2012/02/07/remote-packet-capture-for-ios-devices.html)

~~~
iamshs
So why is it not a developer option then? Why is it continuously on?

~~~
jatoben
An IT tech isn't going to want to enable developer mode on Joe User's iPhone
just to find out why it can't get an IP or join the VPN. The device still has
to be paired with a Mac (and on iOS 7, unlocked and the Trust button tapped)
in order to activate pcap.

~~~
iamshs
" The Pcapd service, for instance, allows people to wirelessly monitor all
network traffic traveling into and out of the device, even when it's not
running in a special developer or support mode. "

So does this mean a private key ripped off a paired Bluetooth speaker ends up
pwning me? If you are taking the case of IT, I think a valid hacking scenario
also needs to be considered. Furthermore, all the data is available
unencrypted. I don't know how comfortable I would be with that. Also, does
"trusted once" mean permanently a slave?

~~~
dunham
This only allows access to the raw packets that are being broadcast over
wifi/cell. (It's like tcpdump, if you're familiar with that.) For stuff sent
encrypted over the internet (https/imaps/etc), it's pretty much useless. If
stuff is being sent unencrypted, there are other means of looking at it
anyway.

The "pairing" refers to when you connect via a USB cable and say "trust this
computer". (The iOS device must be unlocked.)

An encrypted copy of some keys is sent to the computer. These allow the iOS
device to decrypt data that normally can only be decrypted after the passcode
is entered. (Making it possible to back up the device without entering the
passcode.) Those encrypted keys can only be decrypted by a trusted computing
module on that specific device. So you are kinda screwed if someone has both
your laptop and your iPhone, and they have Apple-level access to the iPhone. I
recommend using FileVault or other full-disk encryption to protect your
laptop.

~~~
jatoben
The Escrow Keybag is described in the iOS Security Guide, page 14:

[http://www.apple.com/ipad/business/docs/iOS_Security_Feb14.p...](http://www.apple.com/ipad/business/docs/iOS_Security_Feb14.pdf)

------
pilif
Having an official explanation of what the processes do is very welcome, but
this still leaves open the question of whether there is a way to access these
daemons without prior user approval.

And if there is, the question is who has access to that method and how well
that access is protected (from rogue employees with access to keys, for
example).

At this point, I would still consider all data on my phone to be accessible to
law enforcement and criminals (assuming they have stolen the keys) provided
they have physical access to the device.

I'm basing what data I store on the device on that assumption and, for
example, keep ssh keys separately encrypted and don't store the passphrase.

~~~
ghshephard
If you don't use a complex passcode, then they do have access to all your
files/mail/contacts once they have physical access to your device. Apple has
always made that abundantly clear, and it's how they provide Law Enforcement
access to your files - it's trivial to brute-force a 4-digit passcode once
they've remote-mounted your file system (which they can do once they have
physical access to your device).

The great security concept behind TouchID isn't so much that it's using
biometric login, but that it makes using a secure passcode viable. Nobody is
willing to type in a 15 character password every time they unlock their iPhone
- but most security people are willing to do it during bootup, and then use
touch ID to submit their passcode the rest of the time.
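A back-of-the-envelope sketch of the brute-force claim above. The ~80 ms
cost per passcode-derivation attempt is an assumed figure for illustration,
not something stated in this thread:

```shell
# Worst-case time to brute-force a 4-digit passcode, assuming a
# (hypothetical) ~80 ms key-derivation cost per attempt.
combos=$((10 ** 4))          # 0000-9999
ms_per_attempt=80
seconds=$((combos * ms_per_attempt / 1000))
echo "${combos} combinations, worst case ~${seconds} seconds"
```

At that rate the 4-digit space falls in well under 15 minutes, while a long
alphanumeric passphrase pushes the same attack far past any practical horizon
- which is the point being made about Touch ID.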

~~~
pilif
_> The great security concept behind TouchID isn't so much that it's using
biometric login, but that it makes using a secure passcode viable_

Yes, which is why I'm a very happy user of TouchID. My passphrase is 25
characters long, containing letters, numbers and even symbols.

I furthermore hope that I'll be able to recognize threats early enough to
shut off the phone quickly, forcing a passcode entry.

That's beside my point though: If law enforcement and/or criminals get to use
these diagnostic services without my passcode being entered, then all of this
is moot because they would just access the data and not bother with the
passphrase to begin with.

The article in Apple's knowledge base neither confirms nor denies that there
is access to these diagnostic services that doesn't require the phone to be
unlocked, so I assume that there is a backdoor and I have planned my usage
accordingly.

~~~
mnem
The devices have to be paired though (with the trusted-computer thing), and to
complete that pairing process you need to unlock your phone with the passcode,
if it is locked. If it's unlocked you just have to poke a dialog button, but
if an attacker has it unlocked you're screwed anyway :)

~~~
pilif
Yes. That's the _official_ method.

What I'm concerned about is a backdoor that allows pairing without previously
unlocking the device. Apple neither confirms nor denies that it exists, so we
have to assume it does.

And if it exists, it could have been given to various law enforcement
agencies or stolen by criminals who now use it for identity theft.

Which is why I prefer my hardware not to be backdoored.

------
Titanous
This is almost certainly a direct response to this presentation:
[https://news.ycombinator.com/item?id=8057470](https://news.ycombinator.com/item?id=8057470)

------
owenwil
Wow - this is something Apple would never have done in the past. They're
really pushing to make the point that they don't have any backdoors or
participate with government agencies. An interesting turn; previously Apple
would have just stayed silent.

~~~
MBCook
Apple did something similar in February when they published a pretty extensive
document on how security works in iOS, explaining how iMessage, TouchID, code
signing, etc. are securely set up. Apple takes security quite seriously and
they seem to want to ensure everyone knows it.

[http://www.apple.com/ipad/business/docs/iOS_Security_Feb14.p...](http://www.apple.com/ipad/business/docs/iOS_Security_Feb14.pdf)

Apple also seems to be opening up a bit. The lack of secrecy gag-order on most
of the WWDC information was very surprising.

The pessimistic view (which I'm sure has factored in) is that they're only
publishing such things to poke at Google.

Google can try to post some similar information but in most cases they don't
control the hardware so they can't verify as much of the chain. On the other
hand if Google _is_ providing access to someone they may not be able to
publish such detailed information and Apple can play the "Where's your
document" game.

~~~
ghshephard
"The pessimistic view (which I'm sure has factored in) is that they're only
publishing such things to poke at Google."

Why do you consider this pessimistic? I, for one, would love to live in a
world where all the smart phone vendors started aggressively competing on how
they each protected my privacy and security.

------
jvdh
So pcapd on iOS is supposed to allow you to capture packets from a trusted
computer:
[http://support.apple.com/kb/HT6331?viewlocale=en_US&locale=e...](http://support.apple.com/kb/HT6331?viewlocale=en_US&locale=en_US)

    
    
      pcapd supports diagnostic packet capture from an iOS device to a trusted computer. This is useful for troubleshooting and diagnosing issues with apps on the device as well as enterprise VPN connections. You can find more information at developer.apple.com/library/ios/qa/qa1176.
    

If you actually follow that link, you come out to a page detailing how to do
packet captures for different Apple devices, including iOS:

    
    
      iOS does not support packet tracing directly. However, if you're developing for iOS you can take a packet trace of your app in a number of different ways:
    
      If the problem you're trying to debug occurs on Wi-Fi, you can put your iOS device on a test Wi-Fi network. See Wi-Fi Capture for details.
      If your app uses HTTP, you can configure your iOS device to use a debugging HTTP proxy (such as Charles HTTP Proxy).
      In iOS 5 and later you can use the remote virtual interface facility.
    

There does not seem to be a mention of that pcapd capability in there...

~~~
ryannielsen
The "pcapd capability" is iOS's remote virtual interface.

[https://developer.apple.com/library/Mac/qa/qa1176/_index.htm...](https://developer.apple.com/library/Mac/qa/qa1176/_index.html)

    
    
      iOS 5 added a remote virtual interface (RVI) facility that lets
      you use OS X packet trace programs to capture traces from an
      iOS device. The basic strategy is:
      
      1. Connect your iOS device to your Mac via USB.
      
      2. Set up an RVI for that device. This creates a virtual
      network interface on your Mac that represents the iOS device's
      networking stack.
      
      3. Run your OS X packet trace program, and point it at the RVI
      created in the previous step.
    
    

If you don't pair the iOS device with the Mac, the technique won't work.
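The three steps quoted above can be sketched as a terminal session on the Mac
side. The UDID below is a placeholder, and the commands only work with
Xcode's tools installed and a device that has already been paired:

```shell
# Step 1 is the physical USB connection; steps 2 and 3 follow.
UDID="0123456789abcdef0123456789abcdef01234567"  # placeholder UDID
rvictl -s "$UDID"        # create the virtual interface (e.g. rvi0)
sudo tcpdump -i rvi0 -n  # point a packet trace program at the RVI
rvictl -x "$UDID"        # remove the interface when finished
```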

------
IBM
It's pretty amazing how much press this guy got for nothing.

~~~
Tepix
He raises several interesting points, and Apple addresses only a few of them.

------
Khaine
It seems that you can't put existing devices into supervision without losing
all of the data on them, which really sucks.

------
iancarroll
Is there a way to use the file_relay capability? Documents?

~~~
comex
libimobiledevice has support for it (see filerelaytest.c):

[http://www.libimobiledevice.org/](http://www.libimobiledevice.org/)

edit: com.apple.mobile.house_arrest also exposes much more than you see in
iTunes; there are various applications to use it, such as iExplorer.
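A hedged sketch of what a file_relay session looks like with
libimobiledevice's command-line tools. Exact tool names vary by version, and
filerelaytest is built from the project's source tree rather than installed
by default:

```shell
# Pair with the device first (it must be unlocked to tap "Trust").
idevicepair pair
idevicepair validate   # confirm the pairing record is accepted

# filerelaytest asks the file_relay service for its diagnostic
# source sets and writes the returned archive to disk.
./filerelaytest
```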

------
X-Cubed
So, if they're there for diagnostics why aren't they disabled by default,
requiring user intervention to enable them?

~~~
X-Istence
Unless you plug the device in, unlock it using your pin code/touch ID sensor,
and say "yes, trust the computer I am connected to" these won't be accessible.

~~~
robszumski
This is covered on slide 46 of the original researcher's presentation.

"Oh, and… there’s a bypass switch for pairing anyway"

\--- Next Slide ---

"An electronic alternative to interdiction could be deployed by spoofing
Apple’s certificates and configuring / pairing the device out of the box."

"OR by penetrating a targeted organization, supervisor records can be used to
pair with and access any device they’re supervising. "

~~~
lyinsteve
All that's saying is that they can use whatever mechanism is available to
unlock the device. In an enterprise environment, the supervisor has the
credentials to unlock the device.

> "Oh, and… there’s a bypass switch for pairing anyway"

Nice conjecture. Please show us actual evidence of that.

