Largely meaningless from a security standpoint (easily bypassable), and it enables someone to build a large database of fingerprint hashes. (Don't give me any of that "it doesn't upload the actual fingerprints" - uploading hashes is just as useful to security services)
"disk encryption"
You trust Apple way, way too much.
"The CIA and FBI have gone on record as saying as how frustrated they are with Apple's security."
Which is exactly what they would say, wouldn't they?
"Do you really think iPhones can be used as spying devices?"
> [Touch ID] enables someone to build a large database of fingerprint hashes. (Don't give me any of that "it doesn't upload the actual fingerprints" - uploading hashes is just as useful to security services)
This is speaking from a position of ignorance. Apple has been leading the way with responsible fingerprint management, in contrast to nearly every other fingerprint-capable device manufacturer out there (e.g. see HTC's recent debacle). Presupposing good faith on their part, the stated design of Touch ID is to prevent even the fingerprint hash from being accessible to the OS itself, let alone leaving the device. All the fingerprint logic runs on a logically distinct "secure enclave", which stores the hashes, checks your fingerprints, and just returns "match" or "no match" back to the OS (as I understand it).
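As a rough illustration of that boundary (a sketch against the public LocalAuthentication framework; the OS-to-enclave interface itself isn't public, and apps sit a layer above the OS anyway): the caller only ever receives a match/no-match result, never fingerprint data.

```swift
import LocalAuthentication

// Sketch: from the caller's side, Touch ID is only a yes/no oracle.
// No fingerprint image, template, or hash is ever exposed here.
func authenticateWithTouchID(completion: @escaping (Bool) -> Void) {
    let context = LAContext()
    var error: NSError?

    guard context.canEvaluatePolicy(.deviceOwnerAuthenticationWithBiometrics,
                                    error: &error) else {
        completion(false)  // biometrics unavailable or not enrolled
        return
    }

    context.evaluatePolicy(.deviceOwnerAuthenticationWithBiometrics,
                           localizedReason: "Unlock the example vault") { success, _ in
        // `success` is the only biometric-related information handed back.
        completion(success)
    }
}
```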
If anything, this kind of set-up is what gives me some faith that Apple might do self-driving cars correctly, unlike manufacturers who think it's somehow okay for the entertainment system to run on the same device that can change driving parameters. Apple clearly needs to up its game too, but hacking cars is obviously a Bad Thing, and as cars get more networked and smarter, some car manufacturers do not appear to have the in-house expertise to do it responsibly. So I wouldn't be surprised if a tech company actually ends up doing a better job (whether that's Apple or Google or whoever).
All that you've said rests on that single word, "presupposing": pretty much everything you've described is just taking Apple's word for it. I don't know where you get this faith from. Yes, they say they don't have any way into this "Secure Enclave" from the OS.
We're living in a world where instructions passing through a processor can be sniffed over GSM frequencies from metres away. You're telling me that not a single engineer at Apple, who knows the entire system inside out, can think of any possible way to get data from a device it is physically, electronically connected to and communicates directly with? Not a single obscure "debugging" mode was left in for convenience?
I don't disagree with you (hence the upvote) but let's put this into context: what are we worried about here? My fingerprints are easily obtainable by a determined attacker. No question at all; I leave a trail of them around me everywhere I go. What you mentioned being worried about is mass mining of fingerprints–that hashes are being uploaded to some big database. The design of the system is to prevent that. We'd know if Apple were wholesale uploading fingerprints to its servers. There is no way that someone would not have noticed that happening.
Note that the secure enclave is a separate co-processor with its own memory. It's secure even if the entire OS kernel is compromised.
> Each Secure Enclave is provisioned during fabrication with its own UID (Unique ID) that is not accessible to other parts of the system and is not known to Apple. [my emphasis] When the device starts up, an ephemeral key is created, entangled with its UID, and used to encrypt the Secure Enclave’s portion of the device’s memory space.
> Additionally, data that is saved to the file system by the Secure Enclave is encrypted with a key entangled with the UID and an anti-replay counter.
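To make "entangled with the UID" concrete, here is a conceptual sketch using CryptoKit's HKDF. The real Secure Enclave derivation happens in hardware with a fused UID and isn't public, so treat this as an illustration of the principle (a per-device secret and the anti-replay counter are mixed into every derived key), not as Apple's actual scheme.

```swift
import CryptoKit
import Foundation

// Conceptual sketch only: every derived key mixes in a device-unique secret
// (standing in for the fused UID) plus an anti-replay counter, so data
// encrypted under it can't be decrypted or replayed on another device.
func deriveFileKey(deviceUID: SymmetricKey, antiReplayCounter: UInt64) -> SymmetricKey {
    var info = Data("secure-enclave-file-key".utf8)
    withUnsafeBytes(of: antiReplayCounter.bigEndian) { info.append(contentsOf: $0) }

    return HKDF<SHA256>.deriveKey(inputKeyMaterial: deviceUID,
                                  info: info,
                                  outputByteCount: 32)
}

// Usage sketch: a key derived from one device's UID is useless on another.
let uid = SymmetricKey(size: .bits256)   // hypothetical stand-in for a per-device UID
let key = deriveFileKey(deviceUID: uid, antiReplayCounter: 7)
```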
Note that for the 'rogue engineer' attack, you have to assume that other people at Apple are either colluding with this person or incompetent. Which is not impossible, but it means trusting Apple here is a little more than just blind faith. The secure enclave has also existed for over two years now, since the A7 chip was introduced, and we've yet to see an attack, which I'm sure is not for lack of trying. If there's a backdoor, they're hiding it well, and what would they get out of it anyway?
Are the things you mentioned perfect security? No, but Apple is operating in the land of midgets and is one of the taller players right now.
You're also making a lot of conjectures without any evidence backing them up. The iPhone is one of the most examined pieces of hardware/software out there. If hashes are being uploaded to some master database there would be article after article screaming about it.
The only thing you're close on is whether a mobile device (including an iPhone) can be used for spying: baseband software in the cellular modem is a known area of concern for any mobile device.

For the rest, you're making claims that either directly contradict what is known or have no basis, and you offer no evidence for them.
Funny thing, this sort of information doesn't tend to be put into the public domain.
Anybody who knows anything about the relationships massive companies tend to have with the government can see that it is highly unlikely the situation is exactly as Apple paints it.