iOS: About diagnostic capabilities (support.apple.com)
91 points by comex on July 23, 2014 | 44 comments



I was surprised Zdziarski made such a big deal over the packet capture tool. It's been documented in the referenced developer Q&A and in various blog posts[1] since iOS 5, and I've personally found it very useful for troubleshooting connectivity problems on enterprisey networks.

[1]: http://useyourloaf.com/blog/2012/02/07/remote-packet-capture...


Zdziarski also claimed house_arrest was not used by the dev tools (slide 32), but Xcode allows you to archive and restore your app's entire filesystem. (I believe it uses house_arrest for this.) I tend to use "ifuse" to access this data, and I definitely find it an invaluable development resource.
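
For anyone who wants to try the ifuse route, it looks roughly like this (a sketch only: it assumes libimobiledevice and ifuse are installed, the device is already paired/trusted, the bundle ID is a placeholder, and the exact flag names vary a bit between ifuse versions):

  # mount the app's Documents directory, exposed via the house_arrest service
  mkdir -p ~/ios-app-docs
  ifuse --documents com.example.yourapp ~/ios-app-docs
  ls ~/ios-app-docs        # browse / copy files out
  umount ~/ios-app-docs    # unmount when finished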

I hadn't seen the packet capture thing before, but after reading the slides a quick google search turned up the link to the apple dev util that you referenced.

I'm also not sure how big of a deal the "doesn't require developer mode" thing is, since the issues he raises require pairing records, and I believe you can enable developer mode if you have that much access. I've never had the device ask for confirmation when I turned on developer access.


It's worth reading his response to Apple's latest documentation - http://www.zdziarski.com/blog/?p=3466


So why is it not a developer option then? Why is it continuously on?


An IT tech isn't going to want to enable developer mode on Joe User's iPhone just to find out why it can't get an IP or join the VPN. The device still has to be paired with a Mac (and on iOS 7, unlocked and the Trust button tapped) in order to activate pcap.


" The Pcapd service, for instance, allows people to wirelessly monitor all network traffic traveling into and out of the device, even when it's not running in a special developer or support mode. "

So does this mean a private key ripped off a paired Bluetooth speaker ends up pwning me? If you're citing the IT use case, I think a valid hacking scenario also needs to be considered. Furthermore, all the data is available un-encrypted. I don't know how comfortable I would be with that. Also, does "once trusted" mean permanently a slave?


This only allows access to the raw packets that are being broadcast over wifi/cell. (It's like tcpdump, if you're familiar with that.) For stuff sent encrypted over the internet (https/imaps/etc), it's pretty much useless. If stuff is being sent unencrypted, there are other means of looking at it anyway.

The "pairing" refers to when you connect via a USB cable and say "trust this computer". (The iOS device must be unlocked.)

An encrypted copy of some keys is sent to the computer. These allow the iOS device to decrypt data that normally can only be decrypted after the passcode is entered. (Making it possible to back up the device without entering the passcode.) Those encrypted keys can only be decrypted by a trusted computing module on that specific device. So you are kinda screwed if someone has both your laptop and your iPhone, and they have Apple-level access to the iPhone. I recommend using FileVault or other full-disk encryption to protect your laptop.


The Escrow Keybag is described in the iOS Security Guide, page 14:

http://www.apple.com/ipad/business/docs/iOS_Security_Feb14.p...


Yeah, and I think the version of the keys on your laptop and desktop can be copied and used to access your iOS device at a later date if someone gains brief access to any computer paired to it. GCHQ and the NSA apparently have tools to take advantage of this.


Bluetooth pairing is totally unrelated.


It's too much of a security liability for most of Apple's market. It shouldn't be there. Some slight convenience for 5 percent of Apple's market isn't a good trade-off.


I am surprised that users like you simply don't care that Apple makes it easy for security authorities and criminals alike to access your iOS devices. Thanks to Snowden, we should know that we cannot trust any IT companies, and we should also have learnt to understand the meaning of overspecific denials.


That sounds scary. Could you clarify: how can criminals or security authorities use this to access my device?


Please allow me to refer to http://www.zdziarski.com/blog/?p=3466; Jonathan Zdziarski is far more qualified than I am to provide further information.


They would need physical access to the device and know your unlock code.


Wait, if they have that, they already have everything


Having an official explanation of what the processes do is very welcome, but this still leaves open the question of whether there is a way to access these daemons without prior user approval.

And if there is, the question is who has access to that method and how well that access is protected (from rogue employees with access to keys, for example).

At this point, I would still consider all data on my phone to be accessible to law enforcement and criminals (assuming they have stolen the keys) provided they have physical access to the device.

I'm basing what data I store on the device on that assumption and, for example, keep ssh keys separately encrypted and don't store the passphrase.


If you don't use a complex passcode, then they do have access to all your files/mail/contacts once they have physical access to your device. Apple has always made that abundantly clear, and it's how they provide Law Enforcement access to your files - it's trivial to brute force a 4-digit password once they've remote mounted your file system (which they can do once they have physical access to your device).

The great security concept behind TouchID isn't so much that it's using biometric login, but that it makes using a secure passcode viable. Nobody is willing to type in a 15 character password every time they unlock their iPhone - but most security people are willing to do it during bootup, and then use touch ID to submit their passcode the rest of the time.


> The great security concept behind TouchID isn't so much that it's using biometric login, but that it makes using a secure passcode viable

Yes. Which is why I'm a very happy user of TouchID. My passphrase is 25 characters long, containing letters, numbers and even symbols.

I furthermore hope that I'll be able to recognize threats early enough to shut off the phone, forcing a passcode entry.

That's beside my point though: If law enforcement and/or criminals get to use these diagnostic services without my passcode being entered, then all of this is moot because they would just access the data and not bother with the passphrase to begin with.

The article in Apple's knowledge base neither confirms nor denies that there is access to these diagnostic services that doesn't require the phone to be unlocked, so I assume that there is a backdoor, and I have planned my usage accordingly.


The devices have to be paired though (with the trusted computer thing), and to complete that pairing process you need to unlock your phone with the passcode, if it is locked. If it's unlocked you just have to poke a dialog button, but if it's unlocked you're screwed anyway :)


Yes. That's the official method.

What I'm concerned about is the backdoor that allows pairing without previously unlocking the device. Apple neither confirms nor denies it exists, so we have to assume it exists.

And if it exists, it could have been given to various law enforcement agencies or stolen by criminals who now use it for identity theft.

Which is why I prefer my hardware not to be backdoored.


If you want an intermediate level of security, you can use a longer, numerical passcode, and the device will give you a number pad instead of a keyboard to type it in.

Also brute-forcing must be done on the device because the UID key is involved, so you either need to jailbreak without erasing stuff or have Apple's keys (which allow you to jailbreak without erasing stuff).


Yes - at 5 passwords/second (http://www.elcomsoft.com/ios_forensic_toolkit_faq.html) a six digit password would take about 3 days to recover, 7 digits would be 23 days, and eight digits at 3/4 of a year would mean you would have to be a very interesting target.
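
A quick back-of-the-envelope check of those figures (worst case, i.e. exhausting the whole keyspace at the cited rate of 5 guesses/second):

  for d in 4 6 7 8; do
    # days needed to try every d-digit passcode at 5 guesses/second
    printf "%d digits: %s days\n" "$d" "$(echo "10^$d / 5 / 86400" | bc -l)"
  done

which works out to roughly half an hour for 4 digits, a bit over 2 days for 6, 23 days for 7, and around 230 days for 8.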


I use 9 digits; as well as being "secure enough", it's also a conveniently easy length to remember, as three groups of three digits - nnn nnn nnn


This is almost certainly a direct response to this presentation: https://news.ycombinator.com/item?id=8057470


Wow - this is something Apple would have never done in the past. They're really pushing to make the point that they don't have any backdoors or participate with government agencies. Interesting turn, where Apple would have just stayed silent previously.


Apple did something similar in February when they published a pretty extensive document on how security works in iOS to explain how securely iMessage, TouchID, code signing, etc. are set up. Apple takes security quite seriously and they seem to want to ensure everyone knows it.

http://www.apple.com/ipad/business/docs/iOS_Security_Feb14.p...

Apple also seems to be opening up a bit. The lack of secrecy gag-order on most of the WWDC information was very surprising.

The pessimistic view (which I'm sure has factored in) is that they're only publishing such things to poke at Google.

Google can try to post some similar information, but in most cases they don't control the hardware, so they can't verify as much of the chain. On the other hand, if Google is providing access to someone, they may not be able to publish such detailed information, and Apple can play the "Where's your document" game.


"The pessimistic view (which I'm sure has factored in) is that they're only publishing such things to poke at Google."

Why do you consider this pessimistic? I, for one, would love to live in a world where all the smart phone vendors started aggressively competing on how they each protected my privacy and security.


Managing the public reaction is necessary for the business model of Apple's (and Google's) mobile platforms. They have to prevent users from thinking too long or hard about the level of trust they're placing in them - hence quickly minimizing findings like this one.

The balancing act they've chosen for themselves is to keep the masses of technical users confident enough in the products to use them, but disengaged enough that they avoid really thinking about the security model of the device they've embedded their life into.

At some point, it doesn't even matter if the unmodifiable security decisions they make are right or wrong for a user that "owns" a device but not the signing keys to change the software on it. We all see it as something along the lines of "this design could go really bad, but probably won't." That position relies on masses of consumer confidence, good will, and raising the cost of alternatives. They have to nip this in the bud, because even though there is unease at the fringes, it's not large enough to rock the boat. For now.


Agreed. I think this is more of a Tim Cook approach: in recent times he has tried to send out a clear message on things Apple feels very strongly about, some involving their products and some not - for example, 1. the emphasis on caring for the environment, 2. support for the LGBT community, 3. the focus on user privacy... and so in this case they seem to be pushing the message that user privacy is very important to them.


How is this 'un-Apple'? Apple did it, so by definition it is very 'Apple'.


Traditionally Apple has been a very secretive company and wouldn't publish information that wasn't strictly necessary to many people.

They've been easing off that recently (which is a serious plus), although I'm not sure this information would have been censored before, maybe just behind the developer paywall where most people couldn't easily see it.


So pcapd on iOS is supposed to allow you to capture packets from a trusted computer: http://support.apple.com/kb/HT6331?viewlocale=en_US&locale=e...

  pcapd supports diagnostic packet capture from an iOS device to a trusted computer. This is useful for troubleshooting and diagnosing issues with apps on the device as well as enterprise VPN connections. You can find more information at developer.apple.com/library/ios/qa/qa1176.
If you actually follow that link, you come out to a page detailing how to do packet captures for different Apple devices, including iOS:

  iOS does not support packet tracing directly. However, if you're developing for iOS you can take a packet trace of your app in a number of different ways:

  If the problem you're trying to debug occurs on Wi-Fi, you can put your iOS device on a test Wi-Fi network. See Wi-Fi Capture for details.
  If your app uses HTTP, you can configure your iOS device to use a debugging HTTP proxy (such as Charles HTTP Proxy).
  In iOS 5 and later you can use the remote virtual interface facility.
There does not seem to be a mention of that pcapd capability in there...


The "pcapd capability" is iOS's remote virtual interface.

https://developer.apple.com/library/Mac/qa/qa1176/_index.htm...

  iOS 5 added a remote virtual interface (RVI) facility that lets
  you use OS X packet trace programs to capture traces from an
  iOS device. The basic strategy is:
  
  1. Connect your iOS device to your Mac via USB.
  
  2. Set up an RVI for that device. This creates a virtual
  network interface on your Mac that represents the iOS device's
  networking stack.
  
  3. Run your OS X packet trace program, and point it at the RVI
  created in the previous step.

If you don't pair the iOS device with the Mac, the technique won't work.
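
In practice, once the device is paired, the whole thing is a couple of commands in a Mac terminal (sketch only: it assumes Xcode's tools are installed, tcpdump is run as root, and the UDID below is just a placeholder for your device's identifier):

  UDID=0123456789abcdef0123456789abcdef01234567
  rvictl -s "$UDID"          # creates the rvi0 virtual interface
  sudo tcpdump -i rvi0 -n    # capture until Ctrl-C
  rvictl -x "$UDID"          # remove the interface when done

rvictl only creates and destroys the virtual interface; any OS X capture tool (tcpdump, Wireshark, etc.) can then listen on rvi0.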


It's pretty amazing how much press this guy got for nothing.


He raises several interesting points and Apple addresses only a few of them.


Is there a way to use the file_relay capability? Documents?


libimobiledevice has support for it (see filerelaytest.c):

http://www.libimobiledevice.org/

edit: com.apple.mobile.house_arrest also exposes much more than you see in iTunes; there are various applications to use it, such as iExplorer.


It seems that you can't put existing devices into supervision without losing all of the data on them, which really sucks.


So, if they're there for diagnostics why aren't they disabled by default, requiring user intervention to enable them?


Unless you plug the device in, unlock it using your pin code/Touch ID sensor, and say "yes, trust the computer I am connected to", these won't be accessible.


According to the security researchers, these services are available over TCP and existing pairing requests & responses can be replayed to access them.


This is covered on slide 46 of the original researcher's presentation.

"Oh, and… there’s a bypass switch for pairing anyway"

--- Next Slide ---

"An electronic alternative to interdiction could be deployed by spoofing Apple’s certificates and configuring / pairing the device out of the box."

 "OR by penetrating a targeted organization, supervisor records can be used to pair with and access any device they’re supervising."


All that's saying is that they can use whatever mechanism is available to unlock the device. In an enterprise environment, the supervisor has the credentials to unlock the device.

> "Oh, and… there’s a bypass switch for pairing anyway"

Nice conjecture. Please show us actual evidence of that.



