Hacker News
Apple Disables FaceID if you change the screen yourself [video] (youtube.com)
180 points by fortran77 69 days ago | 112 comments



Everyone commenting with "security" is missing the point (and hasn't watched the first minute of the video). In previous versions of the iPhone the screen was not part of the FaceID system and could be independently replaced. The sensor/cable/motherboard were all still cryptographically paired and secure. This is a brand new limitation in iPhone 13.


"Security" is an easy cop-out to justify lock-in. That people who call themselves technically minded still fall for it is a miracle.

What security scenario would you even use as a basis here? Ridiculous...

Totally not resales negatively affecting Apple's turnover.


Those OEM parts can either come from broken devices... or they can come from stolen ones. Not that I agree with Apple's restrictions, but from a theft victim's point of view, would they like having the thief make good money from the phone?


Wow, sounds like we should cryptographically sign all replacement parts for anything, they could come from a stolen product after all!

That aftermarket car bumper? Sorry, can't use it, might come from a stolen vehicle.


Car makers actually have the same thing. Not on bumpers, but on ignition keys and ECU (and possibly more). You can't swap out the ECU without getting keys reprogrammed by the dealer.


This isn't entirely true. For example, I can purchase software to program keys in Fords.

Specialized smaller shops also have the ability.

Currently there is NO WAY for me to change a screen on an Apple device without an annoying message, or without Face ID issues on the 13. I cannot purchase programming software.


> without getting keys reprogrammed by the dealer.

Or some guy that figured it out with or without insider knowledge (which also usually means anyone can make a key for your car).


I was both insanely impressed and woefully dismayed that the key guy had to hack my car to sell me a key.


To make a key for your car they'd need to break into your car first, so...


Twin that with relay attacks on people who leave their keys close to their door (like many people do) to unlock by proximity, and that's how a lot of newer cars are stolen.


I can get my key reprogrammed at Walmart. What do you mean?


You can choose not to buy an iPhone.


My phone got stolen once and I still like the option to repair my device. Weak hypothetical.


This can be achieved by signing the parts and once a phone is reported as stolen, the respective parts get disabled.

Trivial solution for someone like Apple.
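A minimal sketch of what that could look like (the names and serials here are hypothetical illustrations, not anything Apple actually ships): a pairing service checks a part's identifier against a revocation list built from theft reports before authorizing it.

```python
# Hypothetical sketch: a pairing server checks a part's serial against a
# revocation list of parts from devices reported stolen before authorizing it.
STOLEN_PART_SERIALS = {"SCRN-8891", "SCRN-1204"}  # populated from theft reports

def authorize_part(part_serial: str, signature_valid: bool) -> bool:
    """A part is usable only if its signature checks out AND it has not
    been revoked as coming from a device reported stolen."""
    if not signature_valid:
        return False
    return part_serial not in STOLEN_PART_SERIALS

print(authorize_part("SCRN-5555", signature_valid=True))   # True: genuine, not revoked
print(authorize_part("SCRN-8891", signature_valid=True))   # False: revoked as stolen
print(authorize_part("SCRN-5555", signature_valid=False))  # False: bad signature
```

Under this scheme, legitimate salvage parts keep working while parts stripped from reported-stolen phones stop being useful to fences.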


Since when is it anyone's business (within the limits of the law) where I get my parts from?


This does nothing to target businesses that make money via theft. If I steal iPhones and sell them as they come, I still make money; if I repair iPhones with stolen parts, I also still make money and face the exact same extreme limitations as my legitimate counterparts.


If the parts are useless, who would buy those stolen iPhones?


If I'm selling unmodified stolen iPhones then they wouldn't be useless.


But how would your customers activate said stolen iPhones?


I'm sympathetic to the guy, but I still don't really see this as a big deal. There's not a ton of space in a modern smartphone, so it seems entirely plausible that some legitimate design change or manufacturing difficulty caused the signed components to get moved to the screen. (He says it's not purely a new limitation, which would be more questionable - the cable which was previously cryptographically paired now is not.)


If Apple had the user's best interests in mind, they could implement a system where, if the device detects that any component has been tampered with, the user is alerted during the first boot and required to manually activate the new Face ID sensor. That way, if you got that message without knowing your display had been replaced, you could be wary of someone trying to break into your phone, without sacrificing repairability.


Funnily enough if your front camera stops working your whole phone stops working with the current setup.

I went to an iPhone repair shop because my phone was boot-cycling and the repair guy basically said “yep, that’s a faulty front facing camera, stops it booting”. Then he unplugged it and the phone was able to turn on. At this stage he could even plug the camera back in and it worked… so I’m assuming it was actually an issue with the faceID signing process at boot that bricked my iPhone. The repair guy said it was really common.

He could replace everything except Face ID on the device which was a big shame as it killed the resale value (he could replace the earpiece/camera module with one that didn’t have faceID which is presumably to avoid issues with signing).


Permanently disabling FaceID is a feature, not a bug. FaceID (and the inability to really turn it off) are the reason that I will leave Apple behind when I finally have to replace my current pre-FaceID iPhone.


What do you mean the inability to really turn it off?

I don't even have my face registered and just use a passcode for everything.


Seems like a tiny chip that delayed power (or held the camera’s RESET line high) for a few seconds after boot would solve this.

Maybe someone will make a flex cable that has it all integrated.


Sorry, I mean it all worked except for FaceID which remained non-working.

So the phone boot cycles if the cable is in, but if you take it out and then insert it back during boot the front camera and front speaker work but not FaceID.


If someone's able to do some kind of physical attack that involves a cable swap style mitm to mess with the face id system, what stops them from also rebooting your phone once?


You could make it so that the message stops showing only after a successful unlock. This is trivial.


Maybe but why can't the end user simply be tasked with accepting the new screen signature? That's how GPG, Secure boot, etc. all work. You just have a persistent list of trusted signatures.
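For illustration, a trust-on-first-use flow in the spirit of GPG or Secure Boot could be sketched like this (a hypothetical model, not Apple's actual mechanism; the fingerprint strings are made up):

```python
# Hypothetical trust-on-first-use sketch: the device keeps a persistent
# list of screen key fingerprints the user has explicitly accepted.
trusted_fingerprints: set[str] = set()

def check_screen(fingerprint: str, user_approves) -> bool:
    """Accept a known screen silently; prompt the user for an unknown one."""
    if fingerprint in trusted_fingerprints:
        return True
    if user_approves(fingerprint):           # e.g. "New screen detected. Accept?"
        trusted_fingerprints.add(fingerprint)
        return True
    return False

# First boot with a new screen prompts; subsequent boots do not.
assert check_screen("ab:cd:12", lambda f: True) is True   # user accepts once
assert check_screen("ab:cd:12", lambda f: False) is True  # already trusted, no prompt
```

The design question the thread is circling is who gets to append to that trusted list: the user (repairable, but phishable) or only Apple's pairing service (secure against the "just click yes" attack, but locked down).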


Because not allowing customers the opportunity for competence or independence increases the amount of money Apple can extract. The walls in the garden only ever get higher.


The majority of users are not in a position to understand what a signature is or whether they should accept it.

So everyone will just click "Approve" in order to get their phone back. In which case, what is the point of even having trusted signatures at all?


"We detected that your phone screen has just been changed. Is this expected?" with "yes" and "no" buttons for answers.

Pretty sure this is understandable.


How do you deal with a rouge parts supplier, phone store, phone store employee, etc?


I know this joke! Stop telling them embarrassing stories? I’m sure you meant rogue though.

Yes, in theory it might be a problem. In practice any supply chain that resulted in a significant issue like this would be easily prosecuted so let wetware systems handle the things they’re better suited to.


So what happens if you click “no”?


"It is possible someone tampered with your phone to gain unauthorized entry. Please contact Apple-authorized support."


If your role is to protect data you could always require an OS reinstall before the new part becomes functional, removing any incentive from replacing parts to gain access to a user's existing data.

But the entire assumption that parts are restricted for security reasons is flawed. "Security" is done in the main SoC, the peripherals such as Face ID/Touch ID sensors should just be dumb input devices providing the face/fingerprint data to the SoC which authenticates them. There is absolutely no reason why those should be part of the chain of trust.


Because now you can defeat the security on a stolen phone by replacing the screen and clicking "yes".


Every biometric system on phones also forces the user to set a password/pin as fallback. So no, you cannot just gain entry into a stolen phone like that.

Any phone that lets you just circumvent security by removing a part wasn't secure in the first place.


It shouldn't be particularly surprising that the two parts are cryptographically paired, the sensors for the IR camera and the HSM within the main processor are extensions of the same security boundary effectively. If you could replace the sensor without something being unhappy, the system would be completely broken.


Just to make it clear that this isn't a purely theoretical concern, it's how Windows Hello got bypassed this summer: https://www.wired.com/story/windows-hello-facial-recognition...


But do they have to pair the sensors in each phone distinctly? I assume they have hardware security measures that you cannot just read out the keys from one of the sensors. So why not just put the same "key" in all of the official IR cameras? Then you could still swap cameras with official safe ones, but you could not build a fake replacement.


The sensors are using unique patterns on each device. There would need to be a pairing between the TrueDepth camera and the Secure Enclave in order to account for this unique pattern. This is why you can't copy Face ID data from one device to another and expect it to work.

From the Apple Platform Security document:

> After the TrueDepth camera confirms the presence of an attentive face, it projects and reads thousands of infrared dots to form a depth map of the face along with a 2D infrared image. This data is used to create a sequence of 2D images and depth maps which are digitally signed and sent to the Secure Enclave. To counter both digital and physical spoofs, the TrueDepth camera randomises the sequence of 2D images and depth map captures and projects a device-specific random pattern. A portion of the Secure Neural Engine — protected within the Secure Enclave — transforms this data into a mathematical representation and compares that representation with the enrolled facial data. This enrolled facial data is itself a mathematical representation of the user’s face captured across a variety of poses.

https://support.apple.com/en-gb/guide/security/sec067eb0c9e/...


So the individual dot patterns are fixed in hardware? That's clever. (A former colleague of mine has a vaguely related patent. They create unique patterns inside of chips by applying (sputtering) material at an angle, which creates chaotic patterns that cannot be copied by lithography, so the chip is in principle unclonable.)


Apple doesn't have to permanently disable FaceID to maintain security; just require a password to log in after the screen has changed and reinitialize FaceID. And as another comment said "if the device detects any component has been tampered with, the user will be alerted during the first boot up".


I guess they want to prevent someone from intercepting and tampering with a phone while it's being shipped to the customer, you know the supply chain attacks NSA (https://www.theverge.com/2013/12/29/5253226/nsa-cia-fbi-lapt...) and CIA (https://phys.org/news/2017-03-wikileaks-cia-hacks-apple-mac....) perfected.


But there is a way to replace the screen without disabling FaceID (how Apple does it), and the NSA likely knows how to do that as well. Or if they don't, they could force Apple to do so secretly.


You’d run the pairing process, which can only be run by trusted Apple repairmen with special internal tools, or by any of the guys in Shenzhen back alleys with the device called JC Programmer.


Right and that's the issue... why is this needed in the first place? There's no additional security, it just prevents DIY types from repairing their own phone, as they don't want to buy a $500 tool.


The NSA doesn't have Apple's private keys. They aren't a magical organization that can crack all modern crypto either.


Have you heard about FISA courts? They can issue warrants for surveillance and the affected company is not allowed to disclose this. As a HN reader, it is likely you know about it but most people probably don't. Still, if a federal agent comes along with a warrant, they'll have to comply.

My point is, I personally can't be sure there is no mechanism, legal or illegal, via which the NSA can just compel them to hand over the keys. Who knows, maybe there exists a FOOBAR law that compels them to build backdoors into their products. Maybe they're doing it voluntarily outside of the law. I would assume everything can be trivially hacked nowadays, and a large portion of intelligence analysts spend their time on parallel construction (i.e. creating alternative stories about how they gathered their evidence, without giving away their abilities).


Hypothetical: what good is a court order if the only copy of the key is in a HSM with tamper protection?

I'm not saying that Apple does this. I simply don't know.


>The NSA doesn't have Apple's private keys.

Are you sure? Last I heard the NSA is in the intelligence gathering business... I absolutely would not put it past them that they acquired those keys somehow.

Even if they do not have the keys, bribing an employee and/or installing your own NSA person as an employee in an Apple-authorized repair shop seems like a rather easy thing to do. Or just backdoor the computer that is used to communicate with apple to perform the re-signing stuff. There are thousands of authorized repair shops around the world and you only need to compromise one (or compromise a few so that any "abnormal patterns of repairs" cannot be easily detected by Apple, if it even audits repair logs for suspicious activity at all).

I am sure the NSA has something figured out by now. And not just them.


> Are you sure? Last I heard the NSA is in the intelligence gathering business...

Are you sure there isn't a teapot in orbit around Mars?

The NSA can't magically break into people's HSMs and steal the keys. We have a pretty good idea of what the NSA's capabilities are. If Apple's system is correctly designed, it will only allow components with certificates installed at manufacturing time to be paired together, and the pairing server will use a HSM to hold the private key to do so.

Can the NSA still backdoor your phone? Sure, that's possible by physical definition. They could extract a specific camera unit's private key via FIB analysis, then manufacture a backdoored replacement, insert that key into it, then bribe or otherwise backdoor themselves into some Apple authorized repair shop and run through a re-pairing with it when they swap it into your phone. But that's a lot harder and less likely to happen (i.e. only in extremely high-value cases, if at all) than the silly "oh it's the NSA, they definitely have the keys" nonsense that people throw around here without any supporting evidence.


Yes, I'm sure there isn't a teapot in orbit around Mars. Not because it's impossible (it isn't, we've put other things in orbit around Mars) but because no one has a motive to put a teapot there.

On the other hand, the NSA certainly has a motive to obtain Apple's keys. And while I doubt we really have a good idea of what the NSA's capabilities are, the capabilities we do know about are sufficient.


No, they aren't. The NSA can't just waltz into Apple and break into their HSM to steal their keys. These systems are designed to be robust even against physical access attacks. Even if the NSA had a backdoor in commercial HSMs, they'd have to get to them physically, as these things aren't on the internet directly.

The Snowden leaks gave us a very good idea about what the NSA does. Tap fibre. Backdoor products in transit. Develop and use remote exploits (nothing particularly amazing about them, it's on the same level of RE/exploitation/polishing work as what I've done for game console homebrew in the past). That one time they actually did something novel with modern crypto and figured out how to efficiently break 1024-bit DH for fixed parameter sets. These are all practical, reasonable things that are a far cry from the magical abilities people like to ascribe to them.


>No, they aren't. The NSA can't just waltz into Apple and break into their HSM to steal their keys

The NSA could waltz into the plant manufacturing the HSM. Or they could waltz into Apple and make sure Apple uses the right HSM with keys already known to the NSA, e.g. by replacing the shipment of the HSMs Apple ordered. (This would require the NSA knowing in advance when Apple buys HSMs for what purpose and from what vendor, but again, intelligence gathering business).

>These systems are designed to be robust even against physical access attacks.

They are designed by humans trying to make those things "robust" against attacks. "Robust" is a "best effort" word.

That assumes the thing that got shipped to you actually is the thing you think it is, and not just some nice NSA chip with the name of a popular HSM vendor printed on top. You cannot really open up the chip and look inside since, if this is a properly designed HSM, it is supposed to destroy itself if you tamper with the enclosure or any other vital parts.

I'd be surprised if the NSA does not spend resources on research into breaking HSMs, and on designing "fakes". The level of their success is unknown until the next Snowden shows up. But if they did break some HSMs, they wouldn't be alone [0].

So, there is a chance the NSA can waltz into Apple and break into their HSM and steal their keys.

This assumes Apple uses HSMs for this kind of stuff and uses them correctly.

Since authorized third-party repair shops can replace components and repair the hardware with some help from Apple, as I said before, it's not even necessary to compromise Apple or their HSMs; it's sufficient to compromise those repair shops, whether third party or by placing a few NSA Geniuses around.

But either swapping out HSMs pre-installation or breaking ones with design flaws is certainly in the realm of possibility too.

>The Snowden leaks gave us a very good idea about what the NSA does.

No, Snowden showed what they did in 2013 and before, because that's when he went public.

Snowden does not know what they did 2014, let alone what they are doing now in 2021. That's about 8 years of most recent NSA R&D that we do not know about.

>Backdoor products in transit.

HSM is a "product", last I heard ;)

>These are all practical, reasonable things that are a far cry from the magical abilities people like to ascribe to them.

Magical? Nothing magical in what I said I believe the NSA could reasonably achieve.

We're talking about abusing design flaws in HSMs and/or supply chain attacks not creating "a GUI interface using VisualBasic to track the killers IP address".

[0] https://cryptosense.com/blog/how-ledger-hacked-an-hsm


So is your position on security that Apple should just give up? I'm not really clear what you're advocating.


As with everything related to security granted by US companies, there is an elephant in the room that apparently needs to be restated regularly, and further discussions can proceed with that in mind.

I don't think the parent was advocating for anything related to the NSA, just reminding us of the current state.


Are you certain they don't have Apple's private keys? They have many ways to compel or persuade someone who does have access to those keys to give the keys to the NSA, or to infiltrate Apple with their own agents.


Apple is going to be far more concerned about countries like China, Saudi Arabia etc than the US.

The rule of law is significantly weaker and they have already shown that they will go after even mild dissidents e.g. journalists not just mass-terrorists or other more serious threats.

So sure US may have the keys. But unlikely every country does.


It seems unlikely that Apple and the US government would litigate so aggressively against each other, over hacking or forcing back doors into Apple systems, if Apple had already been forced to hand over the keys to the government.


We're talking about two different systems: the criminal justice system and the national security apparatus. The first system rarely gets help from the second, and the second may even be happy to see such litigation misleading adversaries about the government's real capabilities.


Oh no, there is no way a three-letter agency found out Apple's secret sauce. Obscurity is the only form of security that they'll never be able to crack!


I wonder how that prevents NSA- and CIA-level interception and tampering. The NSA and CIA can't work with Apple to get their hardware signed as well? I don't believe it.


I'm not saying it's realistic, but one could theoretically replace the sensor with a fake one that always returned the same data. That would allow the user to re-set up Face ID and unknowingly let any intruder in.


That’s less secure.


No less secure than the regular OS, which always allows you to use the passcode rather than the FaceID.


Most secure would be for the device to go up in flames as soon as it detects any tampering. I sure wouldn't want to own such a device that could never be repaired.


No, it wouldn't. The sensor is an imager. The back end, which interprets what it delivers, should be separate.

This is more petty Apple bullshit.


I suspect this is actually something to do with the fact that the Face ID sensors are projecting unique patterns across different devices. From the Apple Platform Security document:

> After the TrueDepth camera confirms the presence of an attentive face, it projects and reads thousands of infrared dots to form a depth map of the face along with a 2D infrared image. This data is used to create a sequence of 2D images and depth maps which are digitally signed and sent to the Secure Enclave. To counter both digital and physical spoofs, the TrueDepth camera randomises the sequence of 2D images and depth map captures and projects a device-specific random pattern. A portion of the Secure Neural Engine — protected within the Secure Enclave — transforms this data into a mathematical representation and compares that representation with the enrolled facial data. This enrolled facial data is itself a mathematical representation of the user’s face captured across a variety of poses.

https://support.apple.com/en-gb/guide/security/sec067eb0c9e/...


The back end has to be able to trust the data from the sensor, and the only way to do that is engineer them as part of the same cryptographic trust domain. Otherwise you could just swap the sensor for a dummy device that presents whatever data you like to the back end. That would completely break the security model.

I know a lot of the other phone manufacturers do stuff like this and think it's ok, but Apple knows what it's doing.
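The trust-domain argument can be illustrated with a toy sketch (the key handling and message format here are hypothetical simplifications, not Apple's actual protocol): if each frame is authenticated with a key provisioned at pairing time, a swapped-in dummy sensor that lacks the key cannot produce data the back end will accept.

```python
import hmac, hashlib, os

# Hypothetical sketch of a paired sensor/back-end trust domain:
# a shared key is provisioned into both sides at pairing time.
PAIRING_KEY = os.urandom(32)  # installed in sensor and enclave when paired

def sensor_capture(frame: bytes, key: bytes) -> tuple[bytes, bytes]:
    """A genuine sensor tags each frame with a MAC under the pairing key."""
    return frame, hmac.new(key, frame, hashlib.sha256).digest()

def enclave_accepts(frame: bytes, tag: bytes) -> bool:
    """The back end only trusts frames whose tag verifies under the pairing key."""
    expected = hmac.new(PAIRING_KEY, frame, hashlib.sha256).digest()
    return hmac.compare_digest(tag, expected)

frame, tag = sensor_capture(b"depth-map-bytes", PAIRING_KEY)
print(enclave_accepts(frame, tag))                 # True: frame from paired sensor
print(enclave_accepts(b"spoofed-face-data", tag))  # False: forged frame rejected
```

This is exactly why the replacement-part debate is hard: the pairing is what stops a dummy device from feeding the enclave "whatever data you like", but it also means a new genuine sensor is useless until someone with the provisioning capability re-pairs it.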


How can you trust that the sensor is actually delivering what it sees?


Apple did this with touchid too?


Yes, they did. You had to save the touch sensor during the replacement due to hardware signing.


Great, had my MacBook Pro keyboard replaced, and now its Touch ID sensor is gone.


"Security" has become a weasel word to justify anti-customer practices. It should be called out! The only reason is profit, and it will most certainly create more e-waste.


Is this an undisclosed security feature or something else?


It’s inherent to the security model due to cryptographic signing coupling the components. If you were able to swap out security components, you might be able to spoof messages - it would no longer be secure. A stolen phone could be connected to a fake sensor and used to trick the phone into authenticating.


How would it hurt that security model if the phone could sign new components only with the permission of the logged in user?

> If you were able to swap out security components, you might be able to spoof messages

But Apple repair shops can do exactly that (unless they always replace the device whenever the screen is broken, but I doubt that). What's the purpose in this whole cryptographic integrity if a third party - certified or not - can violate it without leaving a trace? If I was a person who has to worry this much about their security I'd much prefer swapping the screen myself without involvement of people that can't even be honest about "water damage" most of the time.


> unless they always replace the device whenever the screen is broken, but I doubt that

Why do you doubt that? I'm sure that's exactly what they do.

Edit: Well, by "device" here I read "screen + sensor". If you meant the entire phone then I agree with you heh.


It also must mean the sensor is indelibly attached to the screen now.




I wouldn't expect this from a phone I can pay up to $2000 for.


You can just pay Apple to swap in an original screen. It’s not cheap but it isn’t $2000 either.


It’s actually less insane than it used to be; Apple’s first party repair for screens isn’t significantly more expensive than unauthorized repair shops. There used to be a much bigger price difference but the newer iPhones have so many sensors on the screen that the part itself is most of the cost.

Apple’s site is showing ~$250 for a screen replacement (a little more or less depending on the model).


> Apple’s site is showing ~$250 for a screen replacement (a little more or less depending on the model).

Can you link to this please? I cannot find it.



In Ukraine, for an iPhone 13 Max with 1TB you will need to pay $3000.


I mean, good? People shouldn't be using that anyway, as police could force you to unlock your phone against your will.


People who don't want Face ID don't have to turn it on. People who do want it shouldn't have it held ransom by Apple just because they need a screen replacement.


I strongly disagree. Bad security is worse than no security. It's better to disable FaceID when the security of the system has been breached rather than to limp on regardless.


Don't jump through hoops to defend this company.

This device spies on your files for "CSAM" or whatever, but you think it's going out on a limb to save you when the screen breaks?

Security theater. But in all honesty it's just a ploy to sell you more garbage that you don't own and can't repair yourself.


CSAM scanning has nothing to do with device security but with privacy. This is off topic. Also, it is entirely optional anyway; you can just not use iCloud if you don't want to be "spied on" by Apple, and I write this in quotes because willingly sending unencrypted files to Apple doesn't really qualify as being spied on by them.

This is not about defending Apple; I don't support the CSAM scanning feature either, as I find it useless and insulting.


You're not getting it. You wouldn't put your data in a device that leaks it without your consent, would you?

What's the security model of what is effectively Hawking radiation? How can we frame a device that talks directly to Big Brother as secure without mental gymnastics?

Replace "CSAM" with "unapproved thoughts" and look to how Apple caved to Putin just days ago. The CSAM system will be used to benefit the FSB and other intelligence organizations.

Optional today is mandatory tomorrow. This will come back to haunt us.

The device and the company are wholly compromised and morally bankrupt.


a) CSAM detection is already done server-side. Client-side was to increase user privacy.

b) It's only for iCloud Photo Library.

c) You shouldn't use SaaS platforms if you are worried about CSAM detection.

They are required by law to take reasonable efforts to prevent the storage and dissemination of CSAM.


You can force the device to ask for a password by pressing the home button several times or by turning it off.


It's actually slightly easier these days (at least on my phone): hold down the power button as if to turn it off, but don't swipe to actually turn it off, just hit cancel (or home, or power to turn the screen off). The phone will be locked and prompt for your passcode and won't let you use biometrics.


For me (iPhone 11 Pro, iOS 15), holding down the power button activates Siri. The ways I can force lock the phone are either pressing Vol+, Vol-, and power, or pressing power 5 times. But if you have the emergency SOS turned on, the latter will activate a siren and call 911 a few seconds later.


This is a pretty bad idea if you are worried about your phone being seized, just turn it off.


Old iPhone 5s here: Vol+ plus Vol- plus Power buttons do nothing.


That’s not a scenario most people need to worry about.


Wait, isn’t this the nutjob who came up with the weird conspiracy theory about macbook webcams spying on him?

E: Oh yeah, it’s the same guy. He has an interesting track record https://news.ycombinator.com/item?id=20507037


Wow that's rich, you're trying to pass him off as some kind of fringe conspiracy theorist.

All those points got debunked in the comments. As for that thing with the batteries -- you can't blame people for buying illegally imported and/or counterfeit [citation needed] parts when there is no legal way to obtain them outside of AARP.

Oh yeah also, stuff like this is the reason why Apple is a two trillion dollar company.


> you're trying to pass him off as some kind of fringe conspiracy theorist

He is. Just watch the webcam video.

>you can't blame people for buying illegally imported

It’s only illegal to import counterfeit parts.

>there is no legal way to obtain them outside of AARP

It is perfectly legal to buy generic replacement parts without Apple logos, but the Chinese factories don't find it worth their time to produce them, as "authentic" parts earn them more money.

And yeah, he ended up admitting it and bragging about buying counterfeit parts.

> Oh yeah also, stuff like this is the reason why Apple is a two trillion dollar company.

These things have nothing to do with each other, but I rather doubt that you have even read Apple's financial statements.


Also, to points I've seen made elsewhere: Apple has begun placing trademarked material (i.e. logos) on individual parts (including ribbon cables, etc.) to make use of trademark enforcement against otherwise legitimate third-party refurbished parts. He notes this in his videos discussing why he cannot source screens for phones anymore.

Previously, third parties could remove cracked glass from broken screens and re-laminate them to make legitimate refurb parts. However, these were deemed unauthorized refurbs (though not illegal/unlawful to make), so Apple used the logos on cables as justification to have customs seize them as "counterfeits" on behalf of trademark enforcement.


This is nonsense, refurbished screens are not counterfeits. The problem is screens that were manufactured with apple logos without apple authorization.


This is incorrect. Face ID is not disabled if you perform the screen replacement correctly; you just need to transplant the Face ID bits between the screen modules.


Not to be that guy, but source?


Do you have a source which says that it doesn’t?


Burden of proof lies on the person making claims. You made a claim now back it up with source.


I mean, that would literally be this post, right? You’re contradicting this video without any proof.



