I recommend against using biometric identification (freecodecamp.org)
267 points by imartin2k 96 days ago | 232 comments



> And to be clear, a court in the US cannot force you to give up your passcode. That passcode exists in your head, and yours alone. It is your property, and won’t be used to incriminate you or strong-arm access to your data unless you voluntarily give it up.

While technically true, this is misleading in practice. They can't force you to provide your passcode, but they can force you to unlock your phone. Francis Rawls has been in prison for two years now for refusing to decrypt a hard drive.[1] The same principle applies to phones. If a judge finds you in contempt of court, they can imprison you indefinitely.

1: https://arstechnica.com/tech-policy/2017/09/judge-wont-relea...


To add more examples to this, Florida courts have also ruled that you can be imprisoned for not giving police access to your phone.[1]

The "police can't force you to give up your passcode" misconception stems from a case in Virginia from 2014 [2], and while that may still be the case in Virginia, it does not mean you can just say "my phone is locked with a passcode, fuck off cop" in every other jurisdiction.

1: https://9to5mac.com/2017/06/01/fifth-amendement-passcodes-pa...

2: https://9to5mac.com/2014/10/31/touch-id-police/


Hmm, I guess I could just not carry a phone. Or just have it factory reset every morning automatically and not put any personal data on it. I hardly store anything on my phone anyway and use it pretty rarely so it wouldn't make much difference. What a world we live in.


> What a world we live in.

s/world/country/ fixed that for you.


That man may still be in prison, but that drive is still encrypted.

If you are unwilling to give something you know to someone, no amount of force can take it from you. Had that drive been encrypted using facial biometrics, they could have just knocked him out, glued his eyes open, and taken what they wanted.

What works and what has been deemed legal, as you probably already know, are not the same thing.


Which is torture, so they could do that until he tells them the passcode as well.


It's true that torturing a person is an effective way to change what that person wants. It doesn't always end with the torturer getting what they want, though.


Why do people assume that deniability results in more whacking?

Technology can easily be used to encrypt a hard drive to reveal different things for different passwords. TrueCrypt does it.

Plus you can have cryptographic keys stored with friends, or with beacons that signify you are safe. For example, you hide files on your phone before a flight, and to unhide them you need to be on your host's wifi at your destination.

Until the friend or beacon gives the OK, you can either have your phone LOCKED or have all sorts of files encrypted and HIDDEN. The computer can be unlocked but won't show those files.

Why isn't this technology widespread? YOU CAN'T WHACK EVERYONE!
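
Here's a rough sketch of the beacon idea (the SSID, the key-splitting scheme, and the function names are all just assumptions for illustration, not any existing tool's API):

    import hashlib
    import secrets

    def split_key(master_key):
        # Split a key into two XOR shares: one stays on the phone, one lives on
        # the beacon (the host's router, a friend's device, etc.).
        share_local = secrets.token_bytes(len(master_key))
        share_beacon = bytes(a ^ b for a, b in zip(master_key, share_local))
        return share_local, share_beacon

    def recombine(share_local, share_beacon):
        return bytes(a ^ b for a, b in zip(share_local, share_beacon))

    def beacon_release(visible_ssids, trusted_ssid, share_beacon):
        # Only hand back the beacon's share when the trusted network is in range.
        # In reality the share wouldn't be stored on the phone at all; this stub
        # just models the gating logic.
        return share_beacon if trusted_ssid in visible_ssids else None

    # Demo: the hidden-volume key is unrecoverable until the beacon is visible.
    master_key = hashlib.sha256(b"hidden-volume key material").digest()
    local, beacon = split_key(master_key)

    assert beacon_release(["CoffeeShopWiFi"], "HostHomeWiFi", beacon) is None
    released = beacon_release(["HostHomeWiFi"], "HostHomeWiFi", beacon)
    assert recombine(local, released) == master_key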


This might not be a commonly held view, but in my experience societal norms and values are the only things really protecting us. If the police come after me and the courts don't stop them, nothing will make a difference: even with the best cryptography, all they would have to do is threaten my family and I would unlock it. In my view, technological protections of fundamental freedoms should only be temporary stopgaps for what essentially need to be political solutions.


That's what we did with our TI-86 calculators in the '90s. One app was called Fake Memory Clear. The teacher looks and sees what appears to be a freshly cleared calculator (and can even "clear" it again if they want); then you exit the app and get back to your notes, games, and apps.


> Why do people assume that deniability results in more whacking?

Contempt?


I think at some point this gets to the Supreme Court, which will decide whether it's covered by the 5th Amendment or not.


The answer is probably no.

Requiring a person to unlock a device is not prohibited by the Fifth Amendment simply because the device contains incriminating information that would otherwise be inaccessible to police.

If the police have a valid warrant to search your safe, you are generally required to unlock it for them, even if the safe contains evidence that incriminates you. If you are issued a valid subpoena to produce certain documents in your possession, you are generally required to produce those documents, even if they incriminate you. Compelled decryption of hard drives is fundamentally no different.

It is true that the act of unlocking the safe or producing those documents is itself testimonial in the sense that you are conveying the fact that you know the combination or possess those documents. But under the "foregone conclusion" doctrine, if the state already knows that implicit testimony, then it is not protected by the Fifth Amendment. It's obvious that Rawls knows the password to the drives.

There are legitimate concerns about how search warrants should apply to electronic devices. However, these are Fourth Amendment issues, not Fifth Amendment ones.

If you're interested, Orin Kerr from the Volokh Conspiracy has written several articles about compelled decryption, including with respect to this particular case [1, 2, 3].

1: https://www.washingtonpost.com/news/volokh-conspiracy/wp/201...

2: https://www.washingtonpost.com/news/volokh-conspiracy/wp/201...

3: https://www.washingtonpost.com/news/volokh-conspiracy/wp/201...


What checks exist to prevent police from planting a USB hard drive on an enemy of the state that's encrypted and then claiming the defendant won't decrypt the hard drive? The state would have a really easy time imprisoning him/her because the defendant would never be able to provide a key to decrypt it.


The check there is the judge and jury, which could be convinced by the defense that this is not their hard drive - unlike the case of Francis Rawls, where they haven't even attempted to contest that he knows the password. They haven't claimed that they are unable to perform the action, they're simply refusing to do it (and unsuccessfully contesting that they don't have to do this), so that's contempt.

It's kind of counterproductive to assume a singular agent "the state" in this context - all legal checks and balances essentially rely on multiple, separate, competing agents of the state controlling each other, separation of powers and all that. If the police and the judges are all on the same corrupt side, then there's nothing stopping them from convicting you of the murder of Abraham Lincoln and locking you up indefinitely, but we're working off the assumption that this is not the case.


Nothing but morals prevents it, just like planting any other evidence.


I'd say, though, that if planting a USB drive gives a judge a free pass to lock anyone away indefinitely by claiming the defendant is in contempt of court, that drastically changes the payoff of evidence planting.

Planting a gun on a defendant is much harder than planting a USB drive. If it really was the defendant's, they probably have ammunition for it, there's biological evidence on it, etc.

It seems to me there should be an onus on the prosecutor to prove that the defendant has the key and isn't giving it up before the defendant is held in contempt of court: "Here's a video showing the defendant accessing the banned books on the USB drive; she won't give up the USB decryption key!"


Thanks for sharing Kerr's articles! I wasn't familiar with this issue, so I read them, and in my opinion he's dead wrong (as is the 3rd Circuit ruling). The argument that by disclosing his password, Doe is only admitting "I know the password," which is a foregone conclusion, is nonsense. That statement necessarily carries with it a number of additional statements, including "Very few other people (if any) also have this password," by virtue of what a password is, and "I have read/write access to this hard drive," which, when coupled with the previous statement, leads to the conclusion "I wrote the material on this hard drive." Kerr's argument is basically "Doe is only admitting to the premise" while ignoring that an entire chain of reasoning necessarily follows from the premise.


With a safe, as with any encrypted document, there are a finite number of keys. I think that invalidates the idea of possession of the password being important, at least from a "purely lawful" point of view.

Read/write access is basically physical access. Anyone with enough resources could accomplish physical access.

It's possible I'm being paranoid, but it seems like most of law is based on series of assumptions. It isn't a purely logical idea, which is why we have so much fun arguing about it. It is fundamentally the hope of writing down logic in a language that doesn't, unambiguously, contain it.


I think the safe analogy is an excellent way to illustrate the issue. Encrypting a file is essentially the same thing as locking it in a safe, at least in what I believe will ultimately be the "eyes of the law" once this gets fully litigated.


Encryption is nothing like locking something in a safe. Further, in a similar situation I'm pretty sure that rather than going to court they would just open the safe, making the example even more useless.


It's a bit like the safe precisely because there's a lot of existing precedent where they do not "just open the safe" but require the defendant to produce the key/code to it, and instead of attempting to breach the safe, jail the defendant for contempt if they refuse to do so.

In this particular case the authorities are explicitly arguing that there's no good reason to use a different process for passwords as they currently use for safes, and this (requiring the defendant to unlock it) is the standard procedure, not drilling the safe open.


> Encryption is nothing like locking in a safe

Sure, to technical folks like us, but notice I said "in the eyes of the law". Furthermore, police cannot open a safe without a court order, so I guess your reply was a bust all around?


Do you have a citation wherein encryption is treated like a safe or are you like most people kind of winging it?

I know they can't open a safe, or for that matter a door, without a court order. The point was that the comparison between forcing open a safe and forcing someone to produce a passphrase was meaningless, because the apparently treacherous question of compulsion to testify against yourself wouldn't be tested when a drill could do the job.

I am not a lawyer but I know enough to know that most people in most discussions are full of it and know little. What actually is mysterious is why people believe that their nonexistent expertise adds to the discussion.

Imagine if the matter were technical and a bunch of non-tech people (say, the kind who get confused about RAM and storage, referring to both as "memory", or who call the entire machine "the CPU") were volunteering different insights into the question at hand. It would be useless in a funny sort of way.


Sucks to be in jail waiting for that process to play out, though.


It's likely Rawls' case will make it to the Supreme Court. Let's hope it's sooner rather than later.


Fuck, that is absolutely nauseating.

What's worse is that trustworthy deniable encryption - which would solve this - is practically non-existent now that TrueCrypt is gone.


Eh... The prosecution claims, and the judge believes, that they have a list of files they expect to find on the drive. If he gives them a password and those files don't appear, the judge will conclude he gave them the wrong password, and he will stay put until the right password is produced.


So you mean, if he floats, he's a witch, and they burn him, and if he sinks, he's innocent but he drowns?


That's not how this works. With deniable encryption, it is entirely possible that there isn't even a hidden volume to find. They would be unable to prove he has a hidden volume, and surely a judge will not hold him in contempt for failing to produce something he may not be able to produce at all.


Check out, for example, the case of Martin Armstrong, held for 7 years in contempt of court for not producing items he said he didn't have. Eventually he had to enter a plea agreement and serve an additional 5-year term.


Here is another project that is quite simple and works pretty well: https://github.com/dolanor/scubed


You can get this from VeraCrypt (https://www.veracrypt.fr/en/Home.html)


I got the impression that VeraCrypt was not being run very well and that the developers didn't have much security experience based on the audit report that came out. I didn't consider it a trustworthy project. Has anything changed that?


Isn't TrueCrypt 7.1a still reliable?


Yeah, it works well enough. It's a bit of a pain (you have to deactivate it to do the major updates and such on Win10), but it does work.


What if you just say you forgot the passcode?


It's not entirely clear, but Rawls did not use this defense. It should be effective, but we are in unknown territory.


What if you legitimately don't know the password?


Perhaps there could be a way to instantly delete all iPhone data with a voice command.

Something like "Siri, delete my iPhone, code alpha nine x."

Or even an inconspicuous command like "silly sausages", for example.

Might that stop a judge from being able to send you to prison indefinitely?


I would bet the judge would consider that destruction of evidence.

Disclaimer: I never studied law


What if the authorities do not know what the self-erase command is? Like, inputting your password backwards, or adding 1 to every number.

What if it only erases a certain file or partition, so the phone still contains some personal data, but it's like TrueCrypt's plausible deniability?
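
A minimal sketch of that duress-code idea (the code values, KDF parameters, and action names are made up for illustration, not how any real phone behaves):

    import hashlib
    import hmac

    SALT = b"device-unique-salt"  # assumption: some per-device salt

    def code_digest(code):
        return hashlib.pbkdf2_hmac("sha256", code.encode(), SALT, 100_000)

    REAL_CODE_HASH = code_digest("482910")
    # Duress variant: the same digits entered backwards (one possible convention).
    DURESS_CODE_HASH = code_digest("482910"[::-1])

    def handle_entry(entered):
        d = code_digest(entered)
        if hmac.compare_digest(d, REAL_CODE_HASH):
            return "unlock"                 # normal unlock
        if hmac.compare_digest(d, DURESS_CODE_HASH):
            # Quietly erase the sensitive partition, then unlock into the decoy data.
            return "wipe_hidden_partition"
        return "reject"

    print(handle_entry("019284"))  # -> wipe_hidden_partition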


> The same principle applies to phones.

Do you have a reference for this?


That's f'd up. Isn't it the same for Bitcoin private keys too?

I suppose it depends; I guess the "right" thing in a criminal case is to help solve the crime, or something, I don't know.

Seems similar to using a VPN in China: is it illegal to protect/use your privacy?


> Francis Rawls has been in prison for two years now over refusing to decrypt a hard drive.

People have got to stop martyrizing this guy. He's in jail because the prosecution got a fortuitous decision that says they can hold him as long as they want until he coughs up a password. They aren't fishing for evidence, nor have they used this trick on anyone else. If he went to trial on the evidence already in public, the guy would hang (metaphorically). There is no significant doubt in anyone's mind that this drive contains CP.

But the principle is the important part you say? Sure. Make a principled argument. Don't cite this guy.


If it's been proven beyond a reasonable doubt that the drive contains CP, then why don't they just go to trial?

It seems to me that they've intentionally chosen a morally-objectionable individual upon which to build a legal precedent, as anyone who speaks in his defense can have his crime thrown back in their face.

We've heard a lot about these so-called "hashes" that prove the presence of CP. It's also pretty easy to make a probabilistic proof about the likelihood of a hash collision between two non-identical files. I would venture that if you can show mathematically that you're 99.9999999% certain that the files are CP, that would qualify for "beyond a reasonable doubt".
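
Back-of-the-envelope, assuming a 256-bit hash like SHA-256 that behaves ideally (the file count in the second calculation is made up):

    from decimal import Decimal

    # Chance that one specific innocent file happens to share a 256-bit hash
    # with a known contraband file (treating the hash as a random oracle):
    p_single = Decimal(1) / Decimal(2) ** 256
    print(f"{p_single:.3E}")   # about 8.6E-78

    # Rough birthday bound: probability of any collision among n distinct files.
    n = Decimal(10) ** 12      # a trillion files, far more than any drive holds
    p_any = n * n / (Decimal(2) ** 257)
    print(f"{p_any:.3E}")      # about 4.3E-54, still negligible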

It's incontrovertible that if he's committed the crime for which he's been accused, he should be jailed. If they've already proven that crime, then why isn't he already serving his sentence? Or does the prosecution's case rest upon this one piece of evidence, and if so, is he therefore required to testify against himself in order to avoid indefinite detention?


It's worth noting that despite the prosecutors saying "it's a foregone conclusion", they have not actually even charged Rawls with possession of child pornography. It seems to me that while their words say they already have proof, their actions say they don't have any.


> He's in jail because the prosecution got a fortuitous decision that says they can hold him as long as they want until he coughs up a password.

You call it a fortuitous decision, I call it precedent.


Never?

If Jason Bourne is after you that's probably true. If you're worried about border security, that's maybe true.

But for most people, the lock on their phone isn't protecting them from the government, it's protecting them from nosy relatives, a pick pocket, or the guy that finds the phone you left at the bar, or their 4 year old. None of these 'attackers' will ever be sophisticated enough to defeat the biometric protections on an iPhone.

I think people need to adjust their security policies to reflect the actual security threats they're likely to face, and for most people Touch ID or FaceID are more than adequate.


The big thing about TouchID and FaceID is that they are great ways of enforcing a higher level of security than nothing... Prior to these technologies many people I knew did not bother having PINs or were using 1234 because remembering a complex PIN or having to type in something long is too laborious.


Bingo. Numerous times I've had to explain to my parents why using their wedding anniversary for everything is a really bad idea. For folks like us, yeah, we can see issues. But for my parents biometrics is the best worst option.


They are even less convenient for lending a phone to a friend or relative though.

These features are change; not a straight improvement. They have different pros and cons compared to a passcode.


You can still use a password though with Touch ID.


I agree that convenience is the real test of each of these technologies (along with "good enough" security): it's what lets the majority of people have a good experience.

The biggest concerns for the iPhone (or others) then are things like viewing angle, sunlight, etc...

Also, if I were an identical twin (only 0.3% of the population) I would be a bit unhappy that my brother/sister could post anything they wanted on my IG/FB/SN.


What about Halloween when everyone is in costume? Will I be able to access my phone without taking off a mask?


I don't think you can have an identical twin that is a different sex.


I don't have an identical twin, but I also don't know which sex the hypothetical one would be, so... male or female, depends on you.


The US border extends 100 miles inland around the entire perimeter. 2/3rds of the population are subject to federal overreach anywhere in that zone.


"sophisticated" here could be as simple as buying a mass-produced 3d filter sized for the dual lens on the iPhone, installing a companion computer program, running it, uploading a video, and then pointing the phone at the screen. If I were your nosy relative, that certainly wouldn't stop me.

As with any security break, the first research prototypes may sound sophisticated, but they might not be that far off from practical mass-production.


I could just knock you out, get you drunk, or drug you, and just scan your face. I've seen people get into a drunk, passed-out person's phone by picking up their finger and putting it onto the sensor.


I mean, cool, but I'm more worried about someone shoulder surfing with a camera or in person when I type my passcode in


I like how easy it is to unlock my phone with my fingerprint (a back-mounted sensor instead of the iPhone's front-facing design). I'd say the same about fingerprints (don't use them), but your fingerprints are probably all over everything that you own anyway.


> I think people need to adjust their security policies to reflect the actual security threats they're likely to face, and for most people Touch ID or FaceID are more than adequate.

Sounds like you work for Equifax. (=

Look, real security threats are out there -- even if you don't want to acknowledge them. Phones have too much sensitive data, photos, bank accounts -- now the ability to pay via text message. It's just short-sighted to think the only threats people should be worried about are their kids or neighbours.


Sure, but has there been a single case of someone's bank account being robbed because they lost their phone, and someone went through the effort to collect and impersonate their fingerprint to unlock it?

You could be shot at random too, but I'm betting you didn't wear a bullet proof vest today, unless you have cause to think someone is determined to harm you.

And don't I wish that my credit info required a fingerprint to access. That'd be a big step up!


Certainly there have been people with devices that had poor security that were remotely accessed and then robbed.

TeamViewer accounts with weak passwords were being hacked en masse, with scripts trying to visit paypal.com and transferring funds. If you had passwords auto-saved in your browser, they could get in to make the transfer.

FaceID is better than 99.9% of what humans actually do in the real world.


The issues raised in the article may explain why Apple just added the ability to passcode-lock your device by pressing the power button 5 times.

Though people have been raising similar issues about biometric identification for years. See this article from back when TouchID was released in 2013, titled "Fingerprints are Usernames, not Passwords": http://blog.dustinkirkland.com/2013/10/fingerprints-are-user...


Ehh... it's not the same as a username.

It's more like fingerprints are door locks. Any determined thief can get around it. But it protects you from people who aren't really all that determined. And for most people door locks are sufficient. But if you are a major crime lord, protecting something extremely valuable, or just really into security then door locks are not enough.


The "door lock" analogy ignores the biggest flaw with fingerprints: they're forever.

If your door lock is compromised, you can change the key. If someone steals your password, you can change the password. If someone steals your fingerprint, you can never change your fingerprint (same with your face).

The other stuff is dead-on: it's a "good enough" security measure for phones. But as a security practitioner, the biggest problem IMO is that Apple using TouchID and FaceID is giving the general public the wrong idea about security. Apple claims that these innovations are "cutting edge" security, and so consumers buy into this and then also use fingerprints to secure things like their bank accounts, work logins, and password vaults (this is a big one - someone steals your phone and you use your fingerprint to access your LastPass account, which has all of your passwords in it? And your phone is also your 2FA device? You're screwed.), where they really aren't "good enough" at all.

I've worked at companies where we disabled fingerprint logins on certain devices because highly sensitive info is held on those devices, and fingerprints just aren't secure enough to protect them. Then we get yelled at by people from the company because "Apple says fingerprints are the best for security, why aren't you letting us use them?" It's a pain.


> ignores the biggest flaw with fingerprints: they're forever.

The second biggest flaw being your phone is covered in your fingerprints!


It's a sad state. I've heard wealthy and influential investors talk about how they don't think real 2FA is worth anything, because they just want to use their finger for everything. No matter how easy or hard it is to steal, the major problem is that you only have 10 fingers. If all of them get compromised, we still need something else.


> you can never change your fingerprint

Not quite true from the phone's perspective. Most people have nine backups to fall back on if they really need to.


>If someone steals your fingerprint, you can never change your fingerprint (same with your face).

Because you expect repeated attacks from the person who stole your fingerprints? Who are you, James Bond?


It doesn't have to be a repeated attack from the same attacker.

Imagine your fingerprint data is leaked to hundreds of hackers.


And many of those hackers (or even one of them) care enough to (a) steal your phone, (b) fake your fingerprints with a cast or whatever?

Yeah, I'll risk it...


The OPM hack resulted in millions of people's fingerprints and names being stolen, which are now floating around on the internet for anyone to look up.

Individuals who had their fingerprints stolen in that hack can now never use fingerprint readers with any reasonable confidence, since now all a hacker has to do is search that person's name and pull their fingerprint from one of aforementioned databases.

> fake your fingerprints with a cast

Fingerprint scanners like those on phones have been shown to be able to be fooled by using $10 worth of office supplies and some play-dough. It's not like we're talking mastermind levels of intelligence to do this stuff.

Of course, all of this completely ignores the fact that your phone likely already has several copies of your fingerprint on it since you touched it, so it's not like someone hacking your fingerprints is even necessary. That's an entirely different reason why fingerprint security is abysmal, though.


People keep saying fingerprints are all over the internet, but I have seen no actual proof of (1) how you can steal an iPhone fingerprint record, (2) how you can use this data to generate a fake fingerprint sufficient to open the iPhone, or even (3) how you can copy a fingerprint off the outside of the phone and open the iPhone.


1) You don't, but the iPhone Secure Enclave is not what he's talking about. He means fingerprints on the glass.

2) Google "touchid hack"; there are videos on YouTube.

3) Not super likely, as usually you'll only find rough partials, but as the previous poster mentioned, there have been government hacks that leaked biometric data.


OK, how about this. Imagine if MILLIONS of fingerprints are leaked in some sort of wide-net security breach, and now any script kiddie can hack ~50% of phones?


I imagine it. So? How exactly is this kiddie (or any kiddie) gonna also GET my phone?

(That said, I have never had a phone lost or stolen. People who tend to lose them might feel differently.)


https://www.xkcd.com/538/ applies.

Neither Touch ID nor passwords keep determined intruders out. If someone really wants to know what's on your phone, they will arrest/kidnap you and threaten you with prison/violence.


No security is going to keep "determined" intruders out. But the point is that you should still strive to achieve "good enough" security.

The problem is that while the actual ranking from least secure to most secure is "nothing < touchid/faceid < passcode", Apple's marketing and implementation gives people the false impression that its "nothing < passcode < touchid/faceid", which is bad for security.


I think "nothing < passcode < touchid/faceid" might be true for a startling number of people. I've seen many people with ridiculously easy passcodes and even funnier Android patterns (e.g., one of my colleagues uses his first initial as his Android unlock pattern, and my mom uses her dog's name as her passcode).

So Touch/FaceID isn't better than a good passcode, but maybe it's better than a crappy passcode.


And the TouchID/FaceID that people actually use is way better than the passcodes they don't use because those are a pain in the arse.

I noticed a distinct improvement in the speed of the TouchID unlock going from an iPhone 6 to a 7, which pretty much reduced all friction to me using it. Apple's marketing fluff suggests FaceID will be "twice as fast" as TouchID.


I could be wrong, but doesn't a passcode actually encrypt the data (for sure on password manager/banking/etc apps) whereas FaceID/TouchID/<insert biometric here> doesn't? And what about hashing? AFAIK you can't really hash biometrics.
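
For context, here's the general pattern I have in mind, as a sketch (not Apple's actual scheme, which as I understand it entangles the passcode with hardware keys in the Secure Enclave; the parameters here are arbitrary):

    import hashlib
    import os

    # A passcode can be stretched into a real encryption key with a KDF, so the
    # data is unreadable without it.
    salt = os.urandom(16)
    passcode = "483902"
    key = hashlib.pbkdf2_hmac("sha256", passcode.encode(), salt, 200_000)
    print(key.hex())

    # A biometric, by contrast, isn't usable as key material: every scan of the
    # same finger or face differs slightly, so it can't be hashed like a password.
    # Instead the reading is matched against a stored template, and on success
    # the device releases a key it already holds.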


With Touch ID and Face ID, you are required to have a passcode. What's the point of Touch ID if it fails and doesn't have any other way into the phone? As for hashing biometrics, Apple has the Secure Enclave which is for storing the biometrics.


Isn't there a danger of providing 10 wrong passwords and thus triggering the built-in data deletion?


Actually fingerprints are relatively easy, if painful, to change.

Your point stands, of course.


>If someone steals your fingerprint, you can never change your fingerprint (same with your face).

At what point is stealing a fingerprint, retina print, or face going to be economical enough for the thief that this would be an actual valid concern in 99% of use cases? Both FaceID and TouchID need to read a living person with a pulse in order to authenticate. You can't just take a printout of a fingerprint and drop it in. This is a really heavy lift to try to jack some random person's phone. Unless you're securing State Secrets or occupy rarefied enough heights that you have a Swiss bank account I don't really see anyone bothering.

>and so consumers buy into this and then also use fingerprints to secure things like their bank accounts, work logins, password vaults

Which bank accounts are taking fingerprints? Do you mean people's banking apps on their phones? In order to get to that they would need to steal both your phone AND your fingerprint. If a thief is this enterprising your info. is lost anyway. And again, they would need an extremely high fidelity reading of your fingerprint and the ability to reskin a living finger with it. And they would have to execute all this before you get to an Apple Store or a PC to remotely shut it down.

>Then we get yelled at by people from the company because "Apple says fingerprints are the best for security, why aren't you letting us use them?" It's a pain.

This often happens when someone shoves policy down people's throats without explaining themselves or getting buy-in from their clients. This is a communication skills problem, not an issue with biometrics.


> At what point is stealing a fingerprint, retina print, or face going to be economical enough for the thief that this would be an actual valid concern in 99% of use cases?

For the average person who is just securing their phone that only stores pictures of their cat, this isn't a concern, but that's far less than 99%. For pretty much anyone who is logged into their work email/VPN via their phone, or is using fingerprint scanners to secure their work laptop, this is a very real concern that I have seen exploited a few times in the real world.

> Both FaceID and TouchID need to read a living person with a pulse in order to authenticate.

TBD with FaceID, but with TouchID this isn't the case. You can defeat TouchID with $10 worth of office supplies and some play-dough.

> Which bank accounts are taking fingerprints? Do you mean people's banking apps on their phones? In order to get to that they would need to steal both your phone AND your fingerprint.

Since your phone literally has your fingerprint left on it from when you touched it, this isn't really a difficult task.

And as I mentioned, it's even worse if you're one of the people who uses a password manager on your phone that is also locked with fingerprint. Then, every account you have is now compromised. And even if you're using 2FA, your phone is likely your 2FA device, which the thief also has.

> This often happens when someone shoves policy down people's throats without explaining themselves or getting buy-in from their clients. This is a communication skills problem, not an issue with biometrics.

No, it is undeniably an issue with biometrics (and the way they're treated). Training and awareness (communications) is one of the primary problems that any security implementation will try to tackle, but it's just made more difficult to do that when Apple is pushing falsehoods like "TouchID is the most secure thing ever!" in all of their marketing materials.


Biometrics can't be rotated. But they also can't be phished. People have been using "biometrics" to recognize people they trust since the beginning of time, and are pretty rarely fooled. They have also been using passwords since the beginning of time, and passwords have been getting compromised since the next day, when someone walked into the enemy camp after accosting a patrolling guard and demanding the password.

The most important factor of authentication protecting a mobile device is just possession of the device. Fingerprint or face unlock adds what so far in practice seems to be a decent layer of security. Eventually I expect that it will be improved a lot by greater situational awareness on the part of the device: you won't just have to steal the phone and fool the 3d camera, but do both without ever letting the phone see, hear, or otherwise sense anything suspicious. Which is probably getting into mission impossible territory in most situations.

But even without that, in practice I think your corporate secrets would be considerably better defended by something like face id and device identity than by, say, a password and a regular old 2fa token that are both easily and simultaneously and remotely compromised by sending the target an email from yourcompany-itdept.com asking them to log in.


> Biometrics can't be rotated. But they also can't be phished.

Sure they can. Haven't you ever seen a cop show where the detective tricks the suspect into drinking from a cup of coffee so they can lift the suspect's fingerprint from the cup?

"Hi John, nice to meet you! * shakes hand *" I now have John's fingerprints from where he touched me when he shook my hand.

"Hey John, can you send me a selfie?" I now have a picture of John's face and possibly his iris.

Hell, I bet it won't be long at all until someone finds a way to use the iPhone X's own "TrueDepth" camera to record a 3D scan of the user's face which can then be used to fool FaceID.


They can't be phished because they aren't secrets. Yes, if you think of a biometric as a password it is an awful password. But it isn't; its primary source of security is the difficulty of presentation. You should not rely on the secrecy of your biometrics.

You probably don't worry very much that your loved ones have been replaced by impostors, and the reason is not that their appearance is secret! It's just that fooling your face, voice and other "biometrics" without making you suspicious would be, depending on the situation, somewhere between technologically impossible and way more expensive than it would be worth.

A secure biometric is one for which spoofing the sensor is as difficult or expensive as compromising the device hardware some other way. I agree with you that touch ID doesn't quite meet this standard, largely because device hardware has gotten much more tamper resistant in recent years! Hopefully face ID will be better. I can easily remember when it seemed absurd that normal consumer devices would ever have a chance of resisting compromise by a sophisticated adversary that had the device in their possession!


> They can't be phished because they aren't secrets.

And here lies the problem. Apple treats them as if they are.

"Your fingerprint is one of the best passwords in the world" - Apple during the keynote when they introduced TouchID[1]

"Your face is now your secure password" - Apple during yesterday's keynote introducing FaceID[2]

1: https://youtu.be/X5zt1V7H88I?t=227

2: https://youtu.be/K4wEI5zhHB0?t=109


I wouldn't have put it that way, but the claim Apple is making (in baby talk) is that these are good authenticators, not that they are good secrets. I don't think the average person in their audience has a strong reason to understand the difference. If people were used to biometrics and you tried to get them using passwords, then it would be critical to explain the difference (if you tell the wrong person your password, it loses all its security!)

The mistakes you can make by misunderstanding biometrics seem like more of a problem for system designers, who hopefully don't get their whole understanding of security from Apple keynotes.


> I don't think the average person in their audience has a strong reason to understand the difference.

In my experience as a security consultant, one of the biggest problems (and it's a very big problem) we face is that average users lack training and awareness of good security principles. It's really bad to rely solely on system designers for your security. Even if your system designer is 100% effective, it just takes one unaware user to do something bad such as give their password over to a phishing call and you're screwed. And if for nothing else, training and awareness is necessary because without it, you get users kicking and screaming when they don't understand why you've implemented certain security features, which typically means you end up implementing less security to avoid the kicking and screaming.

And just like in your average security training and awareness session you'll have a lesson on "don't give your password to someone on the phone, even if they claim to be your IT guy", we also have lessons on "fingerprints are not passwords, and you should not use them as such", but this is hard to get through people's heads when Apple's marketing material says otherwise (as shown in my previous comment).


Can you please cite sources where Apple marketed TouchID this way? At most, Apple was trying to get people to secure their own phones to start with.


Unfortunately I can't find any archives of Apple's website advertising TouchID when it first came out, but as I remember it was touted as "revolutionary, most secure way to protect your phone", etc. Below[1] is the keynote from 2013 when it was announced. At one point the speaker says "Your fingerprint is one of the best passwords in the world." He also says stuff like "this is the most advanced technology ever in an iPhone", refers to TouchID as "very high level of security", etc.

The FaceID marketing is the same. The iPhone X advertisement released today[2] says "your face is now your secure password". The website says "Face ID is so secure you can use it with Apple Pay". During the keynote today they actually even said up until FaceID, TouchID "was the gold standard". About FaceID they said "FaceID is the future of how we will unlock smartphones".

You'll note that nowhere in any of its materials, or even in the deep recesses of its website, does Apple acknowledge that even though Face/TouchID is great, it's still not as good as a strong passcode. The closest they come is during the keynote, where they acknowledge "nothing is perfect, not even biometric", but you'll notice that even this statement subtly implies that biometrics is the highest security available ("not even biometrics").

1: https://youtu.be/X5zt1V7H88I?t=227

2: https://youtu.be/K4wEI5zhHB0


>For pretty much anyone who is logged into their work email/VPN via their phone, or is using fingerprint scanners to secure their work laptop, this is a very real concern that I have seen exploited a few times in the real world.

You're conflating every fingerprint scanner with Apple's implementation of TouchID, which is far more secure than the check-the-box-to-win-a-government-contract stuff that's been built into most laptops. If it's that important, use a biometric print AND a PIN. Easy enough, no? At least way easier than an absurd password requirement that people wind up writing on a post-it note anyway.

>You can defeat TouchID with $10 worth of office supplies and some play-dough.

And by having a person physically press the correct finger onto your obvious fingerprint-stealing device... OK. But if someone has the power to compel you to do that, they have the power to compel you to just put your finger on your phone for them.

>Since your phone literally has your fingerprint left on it from when you touched it, this isn't really a difficult task.

This is even more involved: it requires lifting the print with a high-fidelity scanner and creating a latex mold of it. What are you securing on your phone where this is a concern? And what do you think they're going to do when there is a face scanner or retina scanner? I suppose they could just clone you, wait however many years for the clone to mature, and then use it.

> it's even worse if you're one of the people who uses a password manager on your phone that is also locked with fingerprint.

Maybe if fewer services forced people into using inane and impossible-to-remember passwords and just relied on biometric authentication instead, folks wouldn't need password managers that are so easy to unlock. Not everything needs the level of security of my bank account, and when every service a person interacts with wants to pretend it's a bank or credit card, it makes people take their bank or credit card's information less seriously than they need to out of sheer fatigue.

Security should be about fostering secure behaviors and culture in your users, not just ramming the most technically secure set of rules at people regardless of the context. That just makes people behave in insecure ways, like what you're talking about, because you haven't bought them into the importance of the big picture.

>it's just made more difficult to do that when Apple is pushing falsehoods like "TouchID is the most secure thing ever!" in all of their marketing materials.

I don't understand how you derived THAT from this:

> Much of our digital lives is stored on our Apple devices, *and we recommend that you always use a passcode or password to help protect this important information and your privacy. Using Touch ID on your iPhone, iPad, and MacBook Pro is an easy way to use your fingerprint instead of a password for many common operations.*


> You're conflating every fingerprint scanner with the Apple's implementation of TouchID, which is far more secure than the check-the-box-to-win-a-government-contract stuff that's been built into most laptops.

No, I'm not. TouchID is the most popular implementation; it's present on every iPhone (which is the most common device to be a work phone, and thus also connected to work email and work networks), and TouchID is also insecure. Hence the problem.

> If it's that important, use a biometric print AND a PIN.

This is not possible on the iPhone, and wouldn't solve the problem anyway: consumers are under the false impression that fingerprints are the best security available, and they become frustrated to learn that Apple has been lying to them when corporate IT tells them fingerprints actually suck and they can't use fingerprint locks (or have to use fingerprint + something else) if they also use their phone for work stuff.

> And by having a person physically press the correct finger onto your obvious fingerprint-stealing device. . .k But if someone has the power to compel you to do that, they have the power to compel you to just put your finger on your phone for them.

What? No, you don't. You're just making stuff up now. You can steal someone's fingerprint by simply having access to something they touched, and then you can duplicate it with $10 worth of office supplies.

> This is even more involved. This involves having to lift the print with a high fidelity scanner and create a latex mold of it. What are you securing on your phone where this is a concern?

Network access to a corporate environment that has millions of SSNs, credit card numbers, etc. You think that a few hours of fiddling around with a latex mold is "too much work" for this? Think again.

> Maybe if fewer services forced people into using inane and impossible-to-remember passwords and just relied on biometric authentication instead folks wouldn’t need password managers that are so easy to unlock.

You miss the point. This wouldn't solve the issue at all, and would actually worsen it. Fingerprints are inherently insecure. Using fingerprints for more accounts is, thus, more insecure.

> I don't understand how you derived THAT from this:

I "derived" it from years of experience working as a cybersecurity consultant where at every company someone complains that "Apple says it's secure, so you must be wrong". Watch the keynote. Apple refers to TouchID as "the gold standard", "one of the most powerful passwords in the world", says "it is the most advanced technology", calls it "very high security".


>(which is the most common device to be a work phone, and thus also connected to work email and work networks)

So your VPN isn't adding any extra layer of auth? This doesn't seem like a TouchID problem. . .

This is also a security design problem. You shouldn't be transmitting sensitive information via e-mail. If I want sensitive data stored in your e-mail, I'd start with a phishing attack long before I decide that physically jacking your phone and coming up with a complicated finger-print stealing process is the way to go.

>This is not possible on the iPhone,

To unlock the phone. You can add additional layers after the iPhone auths all you want.

>consumers are under the false impression that fingerprints are the best security available, and they become frustrated to learn that Apple has been lying to them when corporate IT tells them fingerprints actually suck

Heh. If your clients trust Apple's marketing more than their own IT people this again speaks to me of a severe communication problem among the IT people.

>You can steal someone's fingerprint by simply having access to something they touched

Not with high enough resolution to reliably fool TouchID in few enough attempts to keep it from passcode-locking you out. Your fixation on worst-case scenarios where your adversaries benefit from multiple passes of blind luck doesn't make for great or realistic risk assessment.

>Network access to a corporate environment that has millions of SSNs, credit card numbers, etc. You think that a few hours of fiddling around with a latex mold is "too much work" for this? Think again.

And you're not monitoring for suspicious activity or any additional access control on a data source that has all that sensitive information? You're just letting people mosey on into it with just their phones without so much as a warning flag going up somewhere?

>Fingerprints are inherently insecure. Using fingerprints for more accounts is, thus, more insecure.

This is both highly simplistic and wrong. For one thing, passwords are also inherently insecure, especially when people write them on sticky notes and put them under their monitors. Secondly, not all accounts need maximal security. Not all activities within an account need to give people access to the maximal extent of their privileges. Insisting on going all out on every single thing people try to do fosters insecure habits and insecure system design. You're making the problem worse.

>Apple refers to TouchID as "the gold standard", "one of the most powerful passwords in the world", says "it is the most advanced technology", calls it "very high security".

None of which is false for the use cases they're talking about. You're talking about access to sensitive PII, which Apple did not tell you to gate behind TouchID. I also have years of experience in Infosec and setting the record straight on things takes all of 30 seconds of explanation and taking the time to understand their business context. All it takes is to not treat your clients with contempt.


> Unless you're securing State Secrets or occupy rarefied enough heights that you have a Swiss bank account I don't really see anyone bothering.

You're vastly underestimating how valuable access to a person's phone can be. It's not just about quickly wiring money or stealing state secrets, but also about building blocks for social engineering campaigns, ad/app fraud, extortion, and all sorts of different things.

And the petty thief who steals your phone doesn't need to have the tools to spoof the biometrics. There just needs to be some criminal organization that does and that's willing to pay petty thieves for stolen phones.


>And the petty thief who steals your phone doesn't need to have the tools to spoof the biometrics. There just needs to be some criminal organization that does and that's willing to pay petty thieves for stolen phones.

And have a pipeline that can buy and move stolen phones fast enough to crack them before the owners can remotely wipe them.

Amazon would kill for that kind of logistical capacity.


I get that all of this is valid in theory, but has there been even one single case of a thief, criminal or law enforcement organization actually using biometric data to unlock a phone?

Obviously past events are no guarantee of future, but still — most advisories like this frankly come across as fearmongering.


I think the idea is that legally it is closer to a username.

A judge can allow the police to knock down your apartment door through a warrant. But they can't compel you to speak and incriminate yourself.

Much the same, they can force you to reveal your fingerprint, but cannot compel you to share a password.


> Much the same, they can force you to reveal your fingerprint, but cannot compel you to share a password.

Unfortunately this is dependent on your jurisdiction. In Virginia it's been ruled that law enforcement can't force a password out of you, but in federal court and in other jurisdictions (Florida), they can imprison you indefinitely for not revealing your password.

The justification used is that the password itself doesn't incriminate you, it's just a password. The stuff that would be revealed with the password might be incriminating, but that's different.


Do you have a federal court decision to cite? Your other comments pointed to two state court decisions (FL and VA).



Who you are, what you have, and what you know. Those are what we need to have good security. Fingerprints confirm who you are. What you have is the phone, in this case. What you know is the password.

If you're concerned about security, use the print/face and the password.


Fingerprints do not confirm identity.

A phone is not "what you have" in the context of this discussion. "What you have" would be, for example, an NFC token used to authenticate to the phone. A phone is only "what you have" when you use the phone as evidence to authenticate into a different system.


Fingerprints don't confirm who you are though.

In certain circumstances they can, but a phone sensor accepting whatever input it is being given isn't one of those circumstances.


They do an adequate job for the vast majority of use cases. I guess you could hire a security guard to walk with you and ensure your ID matches your physical traits?

Nothing is ever completely secure and usable. There will be some trade offs.


At a minimum, TouchID/FaceID/similar tech helps users who are too lazy or technically challenged, or who don't consider setting a passcode for various other reasons (e.g., older people with memory issues or Alzheimer's), to have some kind of security. It's better than not having any security at all.


If the Equifax leak and the concerns around not being able to change SSNs teach us anything, it's this.


I would personally like to have groups of things that can be unlocked - that I can define

- Nothing: essentially what's on the lock screen (weather, maybe news headlines)
- Face: basic stuff (games, calculator, news apps)
- Fingerprint: mail, calendar, text messages, browser
- Passcode: banking, settings

An all-or-nothing approach seems backward - there are some things I don't want to protect at all (don't care if someone can access them) on one extreme, and things that MUST be protected as much as possible on the other extreme.

I get leaking between apps is an issue, and there are other problems around this - but this approach seems more reasonable

And yeah, for some users (my parents) they just want something simple and don't want to deal with this. Face or fingerprint is a lot better than no code, so this is still an improvement.
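
A rough sketch of what that tiered policy could look like as configuration (the app names, tier labels, and functions are just examples I made up):

    from enum import IntEnum

    class AuthTier(IntEnum):
        NONE = 0         # lock-screen widgets: weather, headlines
        FACE = 1         # low-stakes apps
        FINGERPRINT = 2  # everyday personal data
        PASSCODE = 3     # things that must stay protected

    REQUIRED_TIER = {
        "weather": AuthTier.NONE,
        "calculator": AuthTier.FACE,
        "mail": AuthTier.FINGERPRINT,
        "banking": AuthTier.PASSCODE,
        "settings": AuthTier.PASSCODE,
    }

    def may_open(app, satisfied_tier):
        # Allow the app if the strongest factor presented so far meets its tier;
        # unknown apps default to the strictest requirement.
        return satisfied_tier >= REQUIRED_TIER.get(app, AuthTier.PASSCODE)

    assert may_open("weather", AuthTier.NONE)
    assert may_open("mail", AuthTier.FINGERPRINT)
    assert not may_open("banking", AuthTier.FINGERPRINT)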


I am not sure why phones haven't been made with different profiles. Yesterday (?), someone here mentioned they wanted to be able to give the (presumed) cops a phone that was blank. I pointed out that was a horrible idea, but didn't really explain why.

If it is a totalitarian regime, they'll just kill you. If you're ever really in such a situation, a blank phone is probably the worst thing you can give them.

Instead, why not a dummy profile that's complete with user activity, social media presence, and showing active harmless use? Why not multiple profiles?

For the rest of us, those who are not spies traveling in totalitarian regimes, what this means is you can hand someone your phone to let them use it. It means you can let your kid use it and not expect to get it back with problems. You can even make the profiles based on the password, so that it only appears to have a single account.

Realistically, the biggest threat is theft. This doesn't hinder theft protection at all. It can still have the same protections, while just offering additional profiles.


Android has pretty good profile support, I have my own profile, a guest one which is wiped when you logout, and one for my kids which can't buy things.

Works pretty well for me, there's a little profile icon in quick settings to switch


Nice! That must be a new feature? It had no such thing, the last time I used Android.

Err... I use a Windows phone, even though I'm normally a Linux user. I kinda like it.


It is present on my Nexus 5 running 6.0.1, and absent on my HTC One A9, running 7.0.

I suspect that it's a feature dropped from many/most customized versions of Android.


I’ve seen it on Lineage OS, but it might be tablet-only.


It's available in Lineage OS for mobile as well.


V. 5.x. Lollipop.


> Instead, why not a dummy profile that's complete with user activity, social media presence, and showing active harmless use? Why not multiple profiles?

And where do you suppose this data will come from? Maintaining something like a plausible and active social media presence is not without effort, nor is creating a profile that would stand up to scrutiny.

If people aren't really looking, it won't matter much, but if they are looking and they get something that seems fake, it might end up getting you in much more trouble.


Presumably, if the person thinks it is an issue then they will make an effort to create and maintain it. It'd not be much use for most of us, but it might be invaluable to someone else.


This is the sort of thing AI could do trivially.


>I am not sure why phones haven't been made with different profiles.

Because they're personal devices. And even if they had, 99% of the population wouldn't even know how to begin using them (like they don't have an extra profile on their laptop).

At most, phones could use an easy "don't let the person I handed my phone to see my dick pics" mode or similar.


I figured granular security has more of an enterprise appeal than consumer, and I still don't really see it.

Email, for example. Day-to-day, our normal authentication should cover what's in my inbox and/or the last few months of messages. A "deep dive" into emails from 10 years ago should probably require a second level of authentication. You don't access them that often. Yet once you're compromised, your whole history of emails can get slurped up very quickly.

I pointed out to my wife not to email anything with our ssn to our tax guy. She kind of balked, but I pointed out if in 10 years he's compromised it's probably still in his email and trivial to scan for ssn or tax documents.

It's been years, but I was at a company that switched to an auto-delete policy after 90 days or something. I thought it was compliance related, but I also think they encouraged you to store important messages in a local inbox which would seem to contradict that.


Obviously facial recognition and fingerprints aren't as good as a passcode. But they're better than the previous alternative, nothing. Before fingerprint/facial recognition, for the most part the only people who used a passcode were forced to because it was a company phone.


Ideally there would be a way to use both, as a two-factor auth mechanism. CopperheadOS supported using fingerprint + passphrase/code briefly but it broke when they moved to Android 7 and they never could find the resources to fix it.


I see people saying Apple should allow fingerprint plus passcode, but I've yet to hear someone explain how it would work if it can't read your fingerprint. A longer passcode? Why not just use the longer passcode in the first place?


How does it work today if you just enable fingerprint authentication (I'm asking because I don't know) ? Do you also have to set a backup passcode to use in case it can't read your fingerprint? Can you register multiple fingerprints in case you decide to put your main index finger too close to a sanding belt?


On iPhone, you can register multiple fingerprints, but if it can't read your finger within 5(?) tries, it requires a passcode. It also allows you to skip the fingerprint and just enter the passcode if you want. That's useful for when you ask someone you trust to find something on your phone for you.


I see. So maybe a combination of fingerprint + shorter passcode to unlock, or a long passphrase to bypass?


And if you reboot your phone you must enter the passcode in order to enable TouchID after that.


Sounds very similar to Android too.


Fingerprint and pin to unlock, or a passcode.


I'd like to add a feature to Face ID: require the user to wink instead of looking with both eyes open, or use a customized facial gesture which only the user knows.

It adds an extra layer of security. Not only that, you get to wink at your phone often as a sign of affection (LOL).

Instead of winks, one might choose to do other facial gestures such as stick their tongue out, do a duck-face, etc.


And it’s defeated by someone just watching you unlock your phone in public once.


Except it isn't, assuming the tech works as described. Face ID uses a 3D reconstruction of your face (now we see where that Primesense acquisition went). You can't bypass it with a video, or a photo, because that would be presented to the phone as a planar surface. You could conceivably create a model of someone, but adding a gesture would make it incredibly difficult to mimic.


As is a passcode?


A passcode that you type in fast seems a bit harder to record than someone winking at their phone.


It would be nice to have two gestures. One to unlock, and one to force passcode mode.


OP completely misses how insecure a four-digit PIN is against prying eyes. If I work in the same office as you, or share any space with you at all, I can pretty much guarantee I can sneak a glimpse of your PIN when you enter it.


Exactly, thieves can decide which phone to steal after seeing the PIN (think about a bus or subway at rush hour). It's possible to get many pictures of someone if you know who they are. It's hard to collect fingerprints, at least for the average thief.

If you want to protect against police, whatever the reason, then a PIN is OK, but be careful when entering it in public.


I feel like if they're determined enough the type of lock you use is irrelevant.

The only reason I have a lock on my phone is so if I lose my phone somewhere randoms can't access it.

If they're going to mug me for my phone they can mug me just as easily for my passcode or my fingerprint.

I guess there is an argument for pick pockets but that's a lesser concern for me. Anywhere it's likely to happen my hand is usually on my phone.


1. It's 6 digits now, and not 4.

2. I can also better hide myself entering the pin

3. As mentioned in other places, unlike a fingerprint or face; you can easily change pin codes.

In an ideal world, facial recognition or touch id would serve as identification in addition to the pin code.


https://www.schneier.com/blog/archives/2016/11/using_wi-fi_t...

PINs are basically useless against anyone determined, no matter how much you try to hide them. There's no real bullet-proof security except for a really long password, but realistically, how many people are going to use one?


> WindTalker presents a novel approach to collect the target's CSI data by deploying a public WiFi hotspot.

If you don't join public WiFi hotspots, this isn't a danger.

Yes, PINs are not foolproof. They are, however, still better than facial scans or thumb scans.


Oh, ha! In the winter when wearing gloves, or if my hands are wet, I often swipe my phone open or click to answer a call using the tip of my nose. Guess that's not what this advice is about, though I was briefly happy to think sufficiently many other people had this habit to warrant a cautionary article.


The author doesn't seem to understand the difference between an iris and a retina, but he expects us to heed his advice on the topic of biometric identification.


Indeed.

For further clarity: The retina is the thin layer of cells at the back of your eye that pick up light. The iris is the colorful ring on the front of your eye.

A retina scan is something most people wouldn't experience outside of an eye doctor's office. It requires really close proximity to the scanner, and it's very clear that it's happening.

Phones, including the Galaxy 8 that the author of the article mentions, use iris scanning.


It's easier to watch people entering their PINs on overhead security camera footage than it is to cheat their biometrics. Dedicated attackers have simpler, more effective options than having to hack your biometrics.

Yes, your fingerprint and faceprint are irreplaceable. They're kept device-local for more reasons than just Apple Pay. But make no mistake: It's simpler to record you entering your PIN surreptitiously than it is to hack your biometrics.

Are the attacks against it "likely" or "unlikely"? They're clearly aware of the "photograph" and "Mission Impossible" scenarios, and demonstrated visible proof of the time spent ensuring those are refused. If you're under a directed and specific attack, that's the best you can hope for from technology! It's not human. It can't magically evolve defenses against humans with time, patience, and hacking powers.

Devices are only safe when a human takes care of them. Your phone, your computer, phone companies, credit bureaus. When you don't take care of technology, someone will eventually exploit it, usually for greed.

Which is more likely:

Someone makes a cool mask device that can hack faceprint, and simultaneously gets sued by Honda Robotics for violating one of their many, many patents on lifelike robotic faces. They use it to hack you, and somehow materially impact your life through hacking your device.

Or, someone hard-resets your phone while you aren't looking, videotapes you entering your PIN confusedly, and then steals it.

A dedicated malicious attacker will always take the second path, because this biometrics crap is useless when you can just get people to blindly enter their PIN as if somehow it's safe to do so anywhere.

TL;DR: Enter your PIN in a bathroom stall, or it's on the security tapes of the mall. If you do this, don't enable Touch ID or Face ID. Problem solved.


How long does security footage last though? It takes only like 3 seconds to type in a PIN so they'd need to keep enough frames to cover that (aside from being lucky enough to have your phone visible to a camera) and I don't imagine they keep full 30fps videos around for long. Do they?


>And if you want really go all out, most phones — including iPhone — support 5 or even 6 digit passcodes.

The iPhone supports arbitrary-length alphanumeric passwords.


If memory serves, you can actually set a longer numeric PIN and it will still give you the number pad, so you get longer but quick-to-enter passcodes. Alphanumeric passwords are obviously more secure but noticeably slower to enter, and IMO it's nice there's a middle ground.


Yes, this is true. My PIN is ~9 characters, digits only.


The Surface Book has had facial recognition for almost two years now, and it is actually impressive. Short of a full latex mask with padding to shape the wearer's face, it is pretty much impossible to break into. We did tests with iPad pictures and depth options, straight camera prints, and a 3D-printed model. None of it worked. It is also not easy to get custom latex masks.

I feel my only issue with FaceID is that when you are in handcuffs, all it takes is for the phone to be held in front of you. It will be interesting to see what safeguards are used to prevent illegal entry by police or captors.


Maybe it permanently disables the phone if it recognizes you and you have your eyes crossed? Or tongue stuck out? :)


Unlocking your phone with your face or fingerprints is, for many people, much better than their 1234 PIN.

As far as I understand, face unlock follows the same rules as Touch ID, where 48 hours of no use requires the PIN/password, plus the newly announced press-the-power-button-five-times gesture to force the PIN/password requirement.


I don't understand why what's essentially a login (fingerprint, face, DNA) is considered a password. It simply isn't.

And I don't understand why I can't (on Android 7) combine fingerprint and then PIN/pattern to unlock my device. It's mind-boggling and completely stupid.


Biometric data is not a username. Biometric data is also not a password.

Biometrics is biometrics. I like to think of it as sitting on a continuum between "username" and "password".

I might like a setting to require both a Touch ID (or Face ID) and a passphrase to unlock my iPhone. However, Touch ID has flaked out enough times for me (not accepting my fingerprints) that I probably wouldn't like to risk it in practice.


Biometrics is closer to a username.


Why? I am not terribly upset if someone has my username, but I would be very concerned if they had reproducible biometrics of mine (fingerprints, facial, etc).


Usernames are fixed values and are generally public. Biometrics are also fixed values and are generally only slightly less public. They're both identifiers.

Passwords can be changed and are secrets. They're authenticators.

The difference between them is exactly the difference between identifiers and authenticators. Misunderstanding this difference causes tons of issues, in a wide variety of situations. The most notable one recently is probably Social Security Numbers being used as both, which leads to identity theft.
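A minimal sketch of that distinction, with hypothetical names (verifyHash stands in for whatever real password-hash comparison you'd use):

  const users = new Map(); // username -> { passwordHash }

  function login(username, password) {
    const record = users.get(username); // identifier: selects the claimed identity
    if (!record) return false;
    // authenticator: proves the claim; verifyHash is a placeholder for a real hash check
    return verifyHash(password, record.passwordHash);
  }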


Because biometrics are usually relatively publicly accessible information. Passwords aren't. You're arguing reproducibility. Well, your face can be replicated by a picture you put on Facebook, fingerprints are left everywhere you go.


Where would Genital ID fall on your continuum?


Perhaps I was too flippant. Point being, the “public availability/replicability” of the biometric would seem correlated to the point on the username->password continuum.

This will probably matter less once our future devices can interact with our sci-fi personal nanites, or rfid implants in the meantime.


The 99% use case for having to unlock a phone with Touch ID / Face ID / a 4-digit passcode is to prevent people from snooping on your phone (think children, coworkers, strangers, thieves) within a relatively short period of time. If your phone is stolen, for example, you'll probably contact Apple and remotely disable it within a day, probably sooner.

I don't think TouchID / FaceID / 4 digits are intended for the 1% use case: preventing malicious actors with long term access to the phone from getting in (think police, government agents, etc.).

As a result, most people I see have 4-digit codes on their phones, without the "10 wrong passwords erases all phone data" option enabled. That alone is probably more insecure than Face ID / Touch ID over the short term. Sure, a 3D map of your face or your fingerprint is not a true password, but for this use case they're a perfectly fine substitute while being more convenient.

If you want long term protection, then you should use an alphanumeric password for your phone with the "10 wrong passwords" option enabled. But most people don't want or need that.


> And I don't understand why I cant (on Android 7) combine fingerprint and then PIN/Pattern to unlock my device. It's mind boggling and completely stupid.

This. I've been looking for a way to have two factor unlock (Fingerprint + PIN) for a long time.


Sure, maybe an 8-digit random alphanumeric password is better for protecting against government agencies, but if you're trying to protect against friends/family/co-workers, it sounds like a win for the user. Besides, you can always press that power button 5 times and boom, you've entered password-only mode.


The suggested alternative is to use passcodes, but then there's no way to unlock your phone without making the unlock code plainly visible.


I mean, it's less plainly visible than your face, or even pictures of your face.

I bet there's an algorithm somewhere that can take a picture of your face and turn it into a 3D model. Then you can take that model, 3D print it, and use it to unlock your phone.


I believe this was the exact attack they were talking about preventing with the "we worked with Hollywood mask-makers" line.

Also, as far as I understand, the demo videos are misleading: these systems (this and Windows Hello) are taking infrared pictures of your face, not visible-light pictures. From their perspective, you look like a (3D depth-tested) network of hot capillaries. This is 1. rather hard to recreate with any amount of sculpture-work, and 2. still identifies you "through" things like foundation/concealer creams.


A few years ago, Apple bought PrimeSense. PrimeSense made the sensor in the original Kinect. They've now integrated this into their Face ID system, or at least that's what the keynote suggests. It projects a known random dot pattern, and an ASIC does some image processing to figure out the depth in the image (nice patent, by the way).

So they have two bits of info: the 3D reconstruction and an infrared image with/without dots; they also have a colour image from the FaceTime camera. I would be surprised if they didn't use more information than just the point cloud. Biometric security at airports uses a similar system; you can see the laser light when you go through the e-Passport gates.

https://techcrunch.com/2017/09/12/iphone-x-basically-has-a-k...
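For the depth part, a rough sketch of the triangulation idea behind a structured-light sensor like that (purely illustrative and simplified, obviously not Apple's actual code; names are made up):

  // For a calibrated projector/camera pair: depth ~ baseline * focal length / disparity.
  function depthFromDisparity(disparityPx, baselineMm, focalLengthPx) {
    // disparityPx: how far a projected dot has shifted from its expected reference position
    return (baselineMm * focalLengthPx) / disparityPx; // depth in mm
  }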


Obviously we do not know the exact workings of Apple's Face ID, but there is certainly potential for making it very difficult to replicate. Infrared by itself does not reveal much, though. UV, on the other hand, creates an entirely different picture; see the link below for a comparison between the two. In the first picture, the right side is IR light; in the second, the right side is UV, and that is the one revealing hidden features of the face.

http://aplus.com/a/jon-simonassi-infrared-photography-freckl...


I'm reminded of the excellent James Mickens piece on security, where he mentions the difference between (IIRC, don't have it in front of me) securing against an angry ex, and securing against Mossad.

Sure, there are entities out there who could probably crack this if they were inclined to target you. But is that truly -- and don't be hyperbolic here -- a thing that you worry about on a day-to-day basis due to actual experience of having been personally targeted by those entities? And is it never acceptable for a consumer device to be only "secure against the angry ex" versus "secure against Mossad"?


So others can be lazy like I wanted to be and not have to look it up: https://www.usenix.org/system/files/1401_08-12_mickens.pdf (warning: pdf, if that matters to you)


Even worse, your ex probably knows your passcode if you've ever opened your phone in front of them.


I think the difficulty of making that 3d model is a bigger barrier than the lower visibility of a passcode. On my Android phone, it even helpfully highlights each digit as I'm entering it on the lock screen.


"It hurts a lot more to change your face than to change your passcode"


Yeah, but there's not much point in having a passcode if it gets compromised on every use.


Why you should never unlock your phone with your face...

... basically because in less than three months, you will see an HN article of someone posting some sort of 3D photo of your face stuck to a watermelon, showing how to fool iOS and unlock your phone anyway.


"And if you really want a random number, paste this into your browser’s JavaScript console..."

I would suggest rolling dice instead. The PIN produced that way would be truly random.


Especially since the script given will produce a pseudorandom number in the range [1000, 9999].


RNGs shipped in browsers today are pretty good, and there is also crypto.getRandomValues().
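For example, something like this sketch pulls the digits from the CSPRNG instead of Math.random() (the function name is made up; the rejection loop avoids modulo bias):

  function randomPin(digits) {
    const max = 10 ** digits;
    const limit = Math.floor(2 ** 32 / max) * max; // largest multiple of max that fits in 32 bits
    const buf = new Uint32Array(1);
    do {
      crypto.getRandomValues(buf); // browser CSPRNG
    } while (buf[0] >= limit);
    return String(buf[0] % max).padStart(digits, "0"); // e.g. randomPin(6) -> "042917"
  }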


And some dice I've had were quite obviously not random. :)


That's why I use both types of authentication / identification, and having more than 1 option is a good thing.

When in a higher-risk situation, such as going through certain airports or borders, or somewhere phone confiscation is likely, just disable biometric authentication temporarily and use a PIN/password.

But in normal situations, where risk is low, re-enable biometrics for convenience. I like the ease of using my phone this way when I'm at home or at work.


I agree, but keep in mind that this same principle applies to Touch ID, which is what Face ID is replacing. Face ID is so much better than Touch ID in many aspects: fewer false positives, it works even if your fingers are wet, and it's a natural behavior to look at the screen.

Both Touch ID and Face ID are trying to protect you from a complete stranger. I expect that Face ID (if it does exactly what the video suggests) will be a harder challenge to defeat.


The author suggests tiers of security, where the biometrics would only unlock the first tier. Isn't that how it already works? If I'm not mistaken, certain operations like changing the Touch ID fingerprint or passcode require you to enter the passcode first. And changing the iCloud information requires the iCloud credentials. Isn't that exactly what the author is asking for?


I use face unlocking (just the built-in Android one) and I am well aware that it isn't secure. I use it because I'd rather have an insecure way of locking my phone than no way of locking my phone. I try not to keep anything too important on my phone, and I often need my phone when it isn't safe/easy to unlock it, so just pointing it at my face is pretty simple.


With all this talk about the 5th amendment and pictures fooling devices, I'm reminded of this classic scene from the 5th element:

https://www.youtube.com/watch?v=nah_3vO0uhM

"That's a very nice hat."


I don't know what's up with his sample JS, a simple "Math.floor(Math.random() * 9999)" would be better.


My only guess is that they assume people will be confused by 3-digit (or fewer) results. They could have coerced it to a string, though, which yields pretty good results:

(Math.floor(Math.random() * 10000) + 10000 + "").substr(1)


I wondered the same thing, but this guarantees not starting with a 0, which I could imagine breaking some people's idea of what a passcode should look like.


Just realized: face recognition unlock is the biggest security scare.

- Case 1: Imagine a security check or border crossing. Guards just take your phone and point it at you: UNLOCKED. No need to resist giving up your password.

- Case 2: Drug the activist and point the phone at the unconscious victim. Voila!

- Case 3: Steal the phone, change the cover, and flash it in front of the real owner!

could go on and on ...


Cases 1 and 2 are covered with Face ID - you have to be actively looking at the phone; drugged/eyes closed/looking away/etc. won't cut it.


> drugged eyes won't cut it.

This seems extremely inconvenient for binge drinkers* that need to Uber home or call a friend.

*or light drug users


You can always back up to your password (at least, with TouchID).


Case 1: "Look at the phone straight-ahead with your eyes or we'll beat you with the rubber-hose again"

Case 2: Hold open the eyelids with tape. Even if the eyes have rolled-back in their sockets they can be re-positioned with some manual adjustment enough to get the system to work.


Isn't Case 1 an attack on every possible method?


Case 3: Give me your password or I beat you again.

How is this different?


For case 2 - I believe your eyes need to be open for this to work.

For case 1 - you can disable faceID prior to crossing borders.


Not sure why you're being downvoted.


Biometric identification works just fine. It's what we use every day between people to identify each other. Machines just aren't very good at it yet, but Face ID is a step in the right direction.


> Let’s take a step back and consider the ultimate biometric identifier of you as a person: your DNA.

It's not. DNA is exactly the same for identical twins, whereas their irises, faces, and fingerprints are different.


> Today Apple announced its new FaceID technology. It’s a new way to unlock your phone through facial recognition.

This line makes it sound like Android hasn't had this feature for years.


I have yet to see a single Android phone where this actually functions correctly. We will see if Apple did a better job. It's not who is first, it's who makes something that works reliably.


Samsung's S8 has an iris scanner that works surprisingly well.

But I've noticed that it works well on some people and total trash on others.


Did you check the video in that article? The S8 scanner can be easily fooled.


There's a funny double standard in mobile reporting. It's incredibly common to see non-Apple phones being called "iPhone clones" but I'm seeing very little mention of the iPhone X looking eerily similar to existing bezel-free-with-cut-out designs like the Essential PH-1 and Aquos S2.


Facile comparison is facile.


Moral of the story: people are bad, so don't store anything sensitive on a tiny little device that can easily be taken from you and broken into in 100 different ways.

If you insist on storing your leaked NSA documents or whatever on your phone, then you just have to accept that you're exposing yourself to a lot more risk (real or imagined) than you would have if you didn't store that stuff on your phone.


What percent of iOS users never create a passcode/PIN? I'm just thinking that if Face ID is on by default, it would be better than nothing for the crowd that never creates a PIN. I agree it is better to have a PIN enabled than Face ID, and better still to have both.


I think I remember them saying 50% back when TouchID was announced


And people wonder why I don't put pictures of myself on the Internet.


Perhaps Apple could add gestures to the facial unlock. Example: If I raise my left eyebrow twice in rapid succession, my iPhone would enter passcode-only unlock mode, or do a factory reset, etc.


The Apple copy about this says it all... 'Your face is now your password'.

Since I definitely wouldn't wear my password around all the time in public, that sounds like a bad idea.


I don't require any password on my phone. The only reason I put a fingerprint on it is to prevent pocket dials. And even with that, it sometimes comes close to dialing 911 by mistake.

All I need is a way for the screen to ignore input (when it turns itself on) unless I activate it with the power button.

Are there really that many people who truly need very high security on their phones?

Seems to me most people just want to deter casual snooping.

Personally I would rather that any app that has high security requirements would secure itself and not require that the entire phone be secured.


If you have email set up on your phone and someone steals your phone without a passcode, it's pretty much game over for all your online accounts. It also defeats two-factor auth if you have that on your phone.

Also a lot of services let you reset your password via SMS etc.


For SMS all you have to do is put the SIM card in a different phone, locking your phone does not help.

The only accounts whose password you can change with a simple email are low-security accounts without much worth stealing.

I'm sure there are people with more stringent security needs, but most people just need to deter casual snooping and don't need to stop hackers.

I personally do not like having the same security on everything. I find that detrimental. I think phone makers should do a better job of having multiple tiers of security.


You can set a separate PIN code on the SIM, and if everything works as designed, the SIM should be useless without that PIN after losing power.


IMO, the real benefit of biometric identification over passwords is that nobody can spy on you while you are unlocking your phone.


This is actually an excellent article on why you should never trust any software produced by Samsung.

Beyond that, it's pathetic click-bait.


I feel bad for those who run this person's random-digits code and get one of the numbers in his table.


Never trust an article that tells you to paste something somewhere (especially a console).


How about a "lock my phone" facial expression?


Can't wait for the "first things to do on your first boot" guides. 1) Turn off Face Unlock: this feature consumes memory, stores a large set of training data for your face, and will drain your battery.


Not exactly correct, as this data is stored in the separate secure enclave, just like with TouchID today.


It's such an obviously bad idea, strange to see Apple put it in.


"Historically it was unsafe to fly in an airplane so you shouldn't now"


What's changed?


A good thought. Just imagine you get robbed. The robbers can easily get the phone unlocked without even asking you.


If the robbers can make me look at the phone, then if they ask me, I'm going to give them my PIN anyway. It's not like I'm planning on resisting giving my PIN out to a robber.



