While technically true, this is false in practice. They can't force you to provide your passcode, but they can force you to unlock your phone. Francis Rawls has been in prison for two years now for refusing to decrypt a hard drive. The same principle applies to phones. If a judge finds you in contempt of court, they can imprison you indefinitely.
The "police can't force you to give up your passcode" misconception stems from a 2014 case in Virginia, and while that may still be the case in Virginia, it does not mean you can just say "my phone is locked with a passcode, fuck off cop" in every other jurisdiction.
s/world/country/ fixed that for you.
If you are unwilling to give up something you know, no amount of force can take it from you. Had that drive been encrypted using facial biometrics, they could have just knocked him out, glued his eyes open, and taken what they wanted.
What works and what has been deemed legal are, as you probably already know, two different things.
Technology can easily be used to encrypt a hard drive so that different passwords reveal different things. TrueCrypt did this with hidden volumes.
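For the curious, the trick is simple enough to sketch in a few lines. The toy below is my own illustration, not TrueCrypt's actual on-disk format, and it is NOT secure (the hash-counter "stream cipher" and magic-marker check are stand-ins for real cryptography). It stores two payloads in one blob so that each password reveals a different one, and a wrong password reveals nothing:

```python
# Toy deniable-encryption sketch: one blob, two passwords, two payloads.
# Illustrative only; do not use for anything real.
import hashlib
import os

MAGIC = b"VOL1"

def _stream(key: bytes, n: int) -> bytes:
    # Toy keystream: hash the key together with a counter.
    out, ctr = b"", 0
    while len(out) < n:
        out += hashlib.sha256(key + ctr.to_bytes(8, "big")).digest()
        ctr += 1
    return out[:n]

def _seal(key: bytes, data: bytes, size: int) -> bytes:
    pt = MAGIC + len(data).to_bytes(4, "big") + data
    pt += os.urandom(size - len(pt))                      # random padding
    return bytes(a ^ b for a, b in zip(pt, _stream(key, size)))

def _unseal(key: bytes, blob: bytes):
    pt = bytes(a ^ b for a, b in zip(blob, _stream(key, len(blob))))
    if pt[:4] != MAGIC:                                   # wrong key: noise
        return None
    n = int.from_bytes(pt[4:8], "big")
    return pt[8:8 + n]

def make_volume(decoy: bytes, hidden: bytes,
                pw_decoy: str, pw_hidden: str) -> bytes:
    size = max(len(decoy), len(hidden)) + 16              # equal-size halves
    k1 = hashlib.sha256(pw_decoy.encode()).digest()
    k2 = hashlib.sha256(pw_hidden.encode()).digest()
    return _seal(k1, decoy, size) + _seal(k2, hidden, size)

def open_volume(blob: bytes, password: str):
    key = hashlib.sha256(password.encode()).digest()
    half = len(blob) // 2
    for part in (blob[:half], blob[half:]):               # try both halves
        data = _unseal(key, part)
        if data is not None:
            return data
    return None
```

The real TrueCrypt design was subtler: the hidden volume lived in what looked like free space inside the outer volume, so even its existence was deniable.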
Plus you can have cryptographic keys stored with friends or beacons that signify you are safe. For example you hide files on your phone before a flight, and to unhide them you need your host's wifi at your destination.
Until the friend or beacon gives the ok, you can either have your phone LOCKED or have all sorts of files that are encrypted and HIDDEN. The computer can be unlocked but won't show those files.
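The key-derivation half of that idea fits in a few lines. This is a minimal sketch under stated assumptions (the names are hypothetical; a real scheme would want a proper KDF and an authenticated channel to the beacon): the unhide key only comes into existence once the beacon's secret arrives, so in transit the phone has nothing to surrender.

```python
# Sketch: unhide key derived from a local passphrase plus a secret
# that only the beacon (e.g. a server on your host's wifi) provides.
import hashlib
import hmac

def derive_unhide_key(passphrase: str, beacon_secret: bytes) -> bytes:
    # HMAC combines the local passphrase with the beacon's secret;
    # without BOTH inputs the derived key cannot be reconstructed.
    return hmac.new(beacon_secret, passphrase.encode(), hashlib.sha256).digest()
```

Until your host's wifi (or a trusted friend) hands over `beacon_secret`, the hidden files are just indistinguishable ciphertext.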
Why isn't this technology widespread? YOU CANT WHACK EVERYONE!
Requiring a person to unlock a device is not prohibited by the Fifth Amendment simply because the device contains incriminating information that would otherwise be inaccessible to police.
If the police have a valid warrant to search your safe, you are generally required to unlock it for them, even if the safe contains evidence that incriminates you. If you are issued a valid subpoena to produce certain documents in your possession, you are generally required to produce those documents, even if they incriminate you. Compelled decryption of hard drives is fundamentally no different.
It is true that the act of unlocking the safe or producing those documents is itself testimonial in the sense that you are conveying the fact that you know the combination or possess those documents. But under the "foregone conclusion" doctrine, if the state already knows that implicit testimony, then it is not protected by the Fifth Amendment. It's obvious that Rawls knows the password to the drives.
There are legitimate concerns about how search warrants should apply to electronic devices. However, these are Fourth Amendment issues, not Fifth Amendment ones.
If you're interested, Orin Kerr from the Volokh Conspiracy has written several articles about compelled decryption, including with respect to this particular case [1, 2, 3].
It's kind of counterproductive to assume a singular agent, "the state", in this context: all legal checks and balances rely on multiple, separate, competing agents of the state controlling each other, separation of powers and all that. If the police and all the judges are on the same corrupt side, then there's nothing stopping them from convicting you of the murder of Abraham Lincoln and locking you up indefinitely for it, but we're working off of the assumption that this is not the case.
Planting a gun on a defendant is much harder to do than planting a USB drive. If it really was the defendant's, they probably have ammunition for it, there are biological traces, etc.
It seems to me there should be an onus on the prosecutor to prove that the defendant has the key and isn't giving it up before the defendant is held in contempt of court: "Here's a video showing the defendant accessing the banned books on the USB drive, and she won't give up the decryption key!"
Read/write access is basically physical access. Anyone with enough resources could accomplish physical access.
It's possible I'm being paranoid, but it seems like most of law is based on a series of assumptions. It isn't a purely logical system, which is why we have so much fun arguing about it. It is fundamentally the hope of writing down logic in a language that doesn't unambiguously contain it.
In this particular case the authorities are explicitly arguing that there's no good reason to use a different process for passwords as they currently use for safes, and this (requiring the defendant to unlock it) is the standard procedure, not drilling the safe open.
Sure, to technical folks like us, but notice I said "in the eyes of the law". Furthermore, police cannot open a safe without a court order, so I guess your reply was a bust all around?
I know they can't open a safe, or for that matter a door, without a court order. The point was that the comparison between forcing open a safe and forcing someone to produce a passphrase was meaningless, because the apparently treacherous question of compulsion to testify against yourself wouldn't be tested when a drill could do the job.
I am not a lawyer but I know enough to know that most people in most discussions are full of it and know little. What actually is mysterious is why people believe that their nonexistent expertise adds to the discussion.
Imagine if the matter were technical and a bunch of non-tech people (say, the kind who get confused about RAM and storage, referring to both as "memory", or who call the entire computer "the CPU") were volunteering different insights into the question at hand. It would be useless in a funny sort of way.
What's worse is that trustworthy deniable encryption - which would solve this - is practically non-existent now that TrueCrypt is gone.
Something like "Siri, delete my iPhone, code alpha nine x."
Or even an undetectable command like "silly sausages" for example.
Might that stop a judge from being able to send you to prison indefinitely?
Disclaimer: I never studied law
What if it only erases a certain file or partition, so the phone still contains some personal data, but it's like TrueCrypt's plausible deniability?
Do you have a reference for this?
I suppose it depends; the "right" thing might be to help solve a crime or something, I don't know.
Seems similar to using a VPN in China. Is it illegal to protect your own privacy?
People have got to stop martyrizing this guy. He's in jail because the prosecution got a fortuitous decision that says they can hold him as long as they want until he coughs up a password. They aren't fishing for evidence, nor have they used this trick on anyone else. If he went to trial on the evidence already in public, the guy would hang (metaphorically). There is no significant doubt in anyone's mind that this drive contains CP.
But the principle is the important part you say? Sure. Make a principled argument. Don't cite this guy.
It seems to me that they've intentionally chosen a morally-objectionable individual upon which to build a legal precedent, as anyone who speaks in his defense can have his crime thrown back in their face.
We've heard a lot about these so-called "hashes" that prove the presence of CP. It's also pretty easy to make a probabilistic proof about the likelihood of a hash collision between two non-identical files. I would venture that if you can show mathematically that you're 99.9999999% certain that the files are CP, that would qualify for "beyond a reasonable doubt".
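The arithmetic backs this up. An accidental SHA-256 collision between an innocent file and a known contraband hash is vanishingly unlikely (deliberate collisions are a different story for broken hashes like MD5, but constructing one would itself take deliberate effort):

```python
# Back-of-the-envelope odds of a benign file matching a known
# SHA-256 hash by pure chance.
p_one_file = 2.0 ** -256        # one random file vs. one known hash
p_billion = p_one_file * 1e9    # vs. a database of a billion known hashes

assert p_billion < 1e-60        # still astronomically small
print(f"{p_one_file:.2e}")
```

Even against a billion-entry hash database, the chance of a random false match is far below any "reasonable doubt" threshold.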
It's incontrovertible that if he's committed the crime for which he's been accused, he should be jailed. If they've already proven that crime, then why isn't he already serving his sentence? Or does the prosecution's case rest upon this one piece of evidence, and if so, is he therefore required to testify against himself in order to avoid indefinite detention?
You call it a fortuitous decision, I call it precedent.
If Jason Bourne is after you that's probably true. If you're worried about border security, that's maybe true.
But for most people, the lock on their phone isn't protecting them from the government, it's protecting them from nosy relatives, a pick pocket, or the guy that finds the phone you left at the bar, or their 4 year old. None of these 'attackers' will ever be sophisticated enough to defeat the biometric protections on an iPhone.
I think people need to adjust their security policies to reflect the actual security threats they're likely to face, and for most people Touch ID or FaceID are more than adequate.
These features are a change, not a straight improvement. They have different pros and cons compared to a passcode.
The biggest concerns for the iPhone (or others) then are things like viewing angle, sunlight, etc...
Also, if I were an identical twin (only 0.3% of the population) I would be a bit unhappy that my brother/sister could post anything they wanted on my IG/FB/SN.
As with any security break, the first research prototypes may sound sophisticated, but they might not be that far off from practical mass-production.
Sounds like you work for Equifax. (=
Look, real security threats are out there -- even if you don't want to acknowledge them. Phones have too much sensitive data, photos, bank accounts -- now the ability to pay via text message. It's just short-sighted to think the only threats people should be worried about are their kids or neighbours.
You could be shot at random too, but I'm betting you didn't wear a bullet proof vest today, unless you have cause to think someone is determined to harm you.
And don't I wish that my credit info requires a fingerprint to access. That'd be a big step up!
TeamViewer accounts with weak passwords were being hacked en masse by scripts that would visit paypal.com and transfer funds. If you had passwords auto-saved in your browser, they could get in to make the transfer.
FaceID is better than 99.9% of what humans actually do in the real world.
Though people have been raising similar issues about biometric identification for years. See this article from back when TouchID was released in 2013, titled "Fingerprints are Usernames, not Passwords": http://blog.dustinkirkland.com/2013/10/fingerprints-are-user...
It's more like fingerprints are door locks. Any determined thief can get around it. But it protects you from people who aren't really all that determined. And for most people door locks are sufficient. But if you are a major crime lord, protecting something extremely valuable, or just really into security then door locks are not enough.
If your door lock is compromised, you can change the key. If someone steals your password, you can change the password. If someone steals your fingerprint, you can never change your fingerprint (same with your face).
The other stuff is dead-on: it's a "good enough" security measure for phones. But as a security practitioner, the biggest problem IMO is that Apple's use of TouchID and FaceID gives the general public the wrong idea about security. Apple claims these innovations are "cutting edge" security, so consumers buy into it and then also use fingerprints to secure things like their bank accounts, work logins, and password vaults, where they really aren't "good enough" at all. (The password vault is the big one: someone steals your phone, your fingerprint unlocks your LastPass account with all of your passwords in it, and your phone is also your 2FA device? You're screwed.)
I've worked at companies where we disabled fingerprint logins on certain devices because highly sensitive info is held on those devices, and fingerprints just aren't secure enough to protect them. Then we get yelled at by people from the company because "Apple says fingerprints are the best for security, why aren't you letting us use them?" It's a pain.
The second biggest flaw being your phone is covered in your fingerprints!
Not quite true from the phone's perspective. Most people have nine backups to fall back on if they really need to.
Because you expect repeated attacks from the person who stole your fingerprints? Who are you, James Bond?
Imagine your fingerprint data is leaked to hundreds of hackers.
Yeah, I'll risk it...
Individuals who had their fingerprints stolen in that hack can now never use fingerprint readers with any reasonable confidence, since now all a hacker has to do is search that person's name and pull their fingerprint from one of aforementioned databases.
> fake your fingerprints with a cast
Fingerprint scanners like those on phones have been shown to be able to be fooled by using $10 worth of office supplies and some play-dough. It's not like we're talking mastermind levels of intelligence to do this stuff.
Of course, all of this completely ignores the fact that your phone likely has several copies of your fingerprint on it already, since you touched it, so someone hacking your fingerprints isn't even necessary. That's an entirely different reason why fingerprint security is abysmal, though.
2) Google "touchid hack"; there are videos on YouTube.
3) Not super likely, as usually you'll only find rough partials, but as the previous poster mentioned, there have been government hacks that leaked biometric data.
(That said, I have never had a phone lost or stolen. People who do tend to lose them might feel differently.)
Neither Touch ID nor passwords keep determined intruders out. If someone really wants to know what's on your phone, they will arrest/kidnap you and threaten you with prison/violence.
The problem is that while the actual ranking from least secure to most secure is "nothing < touchid/faceid < passcode", Apple's marketing and implementation gives people the false impression that it's "nothing < passcode < touchid/faceid", which is bad for security.
So Touch/FaceID isn't better than a good passcode, but maybe it's better than a crappy passcode.
I noticed a distinct improvement in the speed of the TouchID unlock going from an iPhone 6 to a 7, which pretty much reduced all friction to me using it. Apple's marketing fluff suggests FaceID will be "twice as fast" as TouchID.
Your point stands, of course.
At what point is stealing a fingerprint, retina print, or face going to be economical enough for the thief that this would be an actual valid concern in 99% of use cases? Both FaceID and TouchID need to read a living person with a pulse in order to authenticate. You can't just take a printout of a fingerprint and drop it in. This is a really heavy lift to try to jack some random person's phone. Unless you're securing State Secrets or occupy rarefied enough heights that you have a Swiss bank account I don't really see anyone bothering.
>and so consumers buy into this and then also use fingerprints to secure things like their bank accounts, work logins, password vaults
Which bank accounts are taking fingerprints? Do you mean people's banking apps on their phones? To get to those, a thief would need to steal both your phone AND your fingerprint. If a thief is that enterprising, your info is lost anyway. And again, they would need an extremely high-fidelity reading of your fingerprint and the ability to reskin a living finger with it. And they would have to execute all this before you get to an Apple Store or a PC to remotely shut the phone down.
>Then we get yelled at by people from the company because "Apple says fingerprints are the best for security, why aren't you letting us use them?" It's a pain.
This often happens when someone shoves policy down people's throats without explaining themselves or getting buy-in from their clients. This is a communication skills problem, not an issue with biometrics.
For the average person who is just securing their phone that only stores pictures of their cat, this isn't a concern, but that's far less than 99%. For pretty much anyone who is logged into their work email/VPN via their phone, or is using fingerprint scanners to secure their work laptop, this is a very real concern that I have seen exploited a few times in the real world.
> Both FaceID and TouchID need to read a living person with a pulse in order to authenticate.
TBD with FaceID, but with TouchID this isn't the case. You can defeat TouchID with $10 worth of office supplies and some play-dough.
> Which bank accounts are taking fingerprints? Do you mean people's banking apps on their phones? In order to get to that they would need to steal both your phone AND your fingerprint.
Since your phone literally has your fingerprint left on it from when you touched it, this isn't really a difficult task.
And as I mentioned, it's even worse if you're one of the people who uses a password manager on your phone that is also locked with fingerprint. Then, every account you have is now compromised. And even if you're using 2FA, your phone is likely your 2FA device, which the thief also has.
> This often happens when someone shoves policy down people's throats without explaining themselves or getting buy-in from their clients. This is a communication skills problem, not an issue with biometrics.
No, it is undeniably an issue with biometrics (and the way they're treated). Training and awareness (communications) is one of the primary problems that any security implementation will try to tackle, but it's just made more difficult to do that when Apple is pushing falsehoods like "TouchID is the most secure thing ever!" in all of their marketing materials.
The most important factor of authentication protecting a mobile device is just possession of the device. Fingerprint or face unlock adds what so far in practice seems to be a decent layer of security. Eventually I expect that it will be improved a lot by greater situational awareness on the part of the device: you won't just have to steal the phone and fool the 3d camera, but do both without ever letting the phone see, hear, or otherwise sense anything suspicious. Which is probably getting into mission impossible territory in most situations.
But even without that, in practice I think your corporate secrets would be considerably better defended by something like face id and device identity than by, say, a password and a regular old 2fa token that are both easily and simultaneously and remotely compromised by sending the target an email from yourcompany-itdept.com asking them to log in.
Sure they can. Haven't you ever seen a cop show where the detective tricks the suspect into drinking from a cup of coffee so they can lift the suspect's fingerprint from the cup?
"Hi John, nice to meet you! * shakes hand *"
I now have John's fingerprints from where he touched me when he shook my hand.
"Hey John, can you send me a selfie?"
I now have a picture of John's face and possibly his iris.
Hell, I bet it won't be long at all until someone finds a way to use the iPhone X's own "TrueDepth" camera to record a 3D scan of the user's face which can then be used to fool FaceID.
You probably don't worry very much that your loved ones have been replaced by impostors, and the reason is not that their appearance is secret! It's just that fooling your face, voice and other "biometrics" without making you suspicious would be, depending on the situation, somewhere between technologically impossible and way more expensive than it would be worth.
A secure biometric is one for which spoofing the sensor is as difficult or expensive as compromising the device hardware some other way. I agree with you that touch ID doesn't quite meet this standard, largely because device hardware has gotten much more tamper resistant in recent years! Hopefully face ID will be better. I can easily remember when it seemed absurd that normal consumer devices would ever have a chance of resisting compromise by a sophisticated adversary that had the device in their possession!
And here lies the problem. Apple treats them as if they are.
"Your fingerprint is one of the best passwords in the world" - Apple during the keynote when they introduced TouchID
"Your face is now your secure password" - Apple during yesterday's keynote introducing FaceID
The mistakes you can make by misunderstanding biometrics seem like more of a problem for system designers, who hopefully don't get their whole understanding of security from Apple keynotes.
In my experience as a security consultant, one of the biggest problems (and it's a very big problem) we face is that average users lack training and awareness of good security principles. It's really bad to rely solely on system designers for your security. Even if your system designer is 100% effective, it just takes one unaware user to do something bad such as give their password over to a phishing call and you're screwed. And if for nothing else, training and awareness is necessary because without it, you get users kicking and screaming when they don't understand why you've implemented certain security features, which typically means you end up implementing less security to avoid the kicking and screaming.
And just like in your average security training and awareness session you'll have a lesson on "don't give your password to someone on the phone, even if they claim to be your IT guy", we also have lessons on "fingerprints are not passwords, and you should not use them as such", but this is hard to get through people's heads when Apple's marketing material says otherwise (as shown in my previous comment).
The FaceID marketing is the same. The iPhone X advertisement released today says "your face is now your secure password". The website says "Face ID is so secure you can use it with Apple Pay". During the keynote today they even said that, up until FaceID, TouchID "was the gold standard". About FaceID they said "FaceID is the future of how we will unlock smartphones".
You'll note that nowhere in any of its materials, or even in the deep recesses of its website, does Apple acknowledge that even though Face/TouchID is great, it's still not as good as a strong passcode. The closest they come is during the keynote, when they acknowledge that "nothing is perfect, not even biometrics", but notice that even this statement subtly implies that biometrics are the highest security available ("not even biometrics").
You're conflating every fingerprint scanner with Apple's implementation of TouchID, which is far more secure than the check-the-box-to-win-a-government-contract stuff that's been built into most laptops. If it's that important, use a biometric print AND a PIN. Easy enough, no? At least way easier than an absurd password requirement that people wind up writing on a post-it note anyway.
>You can defeat TouchID with $10 worth of office supplies and some play-dough.
And by having a person physically press the correct finger onto your obvious fingerprint-stealing device... But if someone has the power to compel you to do that, they have the power to compel you to just put your finger on your phone for them.
>Since your phone literally has your fingerprint left on it from when you touched it, this isn't really a difficult task.
This is even more involved. This involves having to lift the print with a high fidelity scanner and create a latex mold of it. What are you securing on your phone where this is a concern? And what do you think they're going to do when there is a face-scanner or retina scanner? I suppose they could just clone you, wait however many years for the clone to mature, and then use it.
> it's even worse if you're one of the people who uses a password manager on your phone that is also locked with fingerprint.
Maybe if fewer services forced people into using inane and impossible-to-remember passwords and just relied on biometric authentication instead, folks wouldn't need password managers that are so easy to unlock. Not everything needs the level of security of my bank account, and when every service a person interacts with wants to pretend it's a bank or credit card, it makes people take their bank's or credit card's information less seriously than they need to, out of sheer fatigue.
Security should be about fostering secure behaviors and culture in your users, not just ramming the most technically secure set of rules at people regardless of the context. That just makes people behave in insecure ways, like what you're talking about, because you haven't bought them into the importance of the big picture.
>it's just made more difficult to do that when Apple is pushing falsehoods like "TouchID is the most secure thing ever!" in all of their marketing materials.
I don't understand how you derived THAT from this:
>Much of our digital lives is stored on our Apple devices, and we recommend that you always use a passcode or password to help protect this important information and your privacy. Using Touch ID on your iPhone, iPad, and MacBook Pro is an easy way to use your fingerprint instead of a password for many common operations.
No, I'm not. TouchID is the most popular implementation, and because it's present on every iPhone (which is the most common device to be a work phone, and thus also connected to work email and work networks), and because TouchID is also insecure, thus arises the problem.
> If it's that important, use a biometric print AND a PIN.
This is not possible on the iPhone, and wouldn't solve the problem anyway: consumers are under the false impression that fingerprints are the best security available, and they become frustrated to learn that Apple has been lying to them when corporate IT tells them fingerprints actually suck and they can't use fingerprint locks (or have to use fingerprint + something else) if they also use their phone for work stuff.
> And by having a person physically press the correct finger onto your obvious fingerprint-stealing device. . .k But if someone has the power to compel you to do that, they have the power to compel you to just put your finger on your phone for them.
What? No, you don't. You're just making stuff up now. You can steal someone's fingerprint by simply having access to something they touched, and then you can duplicate it with $10 worth of office supplies.
> This is even more involved. This involves having to lift the print with a high fidelity scanner and create a latex mold of it. What are you securing on your phone where this is a concern?
Network access to a corporate environment that has millions of SSNs, credit card numbers, etc. You think that a few hours of fiddling around with a latex mold is "too much work" for this? Think again.
> Maybe if fewer services forced people into using inane and impossible-to-remember passwords and just relied on biometric authentication instead folks wouldn’t need password managers that are so easy to unlock.
You miss the point. This wouldn't solve the issue at all, and would actually worsen it. Fingerprints are inherently insecure. Using fingerprints for more accounts is, thus, more insecure.
> I don't understand how you derived THAT from this:
I "derived" it from years of experience working as a cybersecurity consultant where at every company someone complains that "Apple says it's secure, so you must be wrong". Watch the keynote. Apple refers to TouchID as "the gold standard", "one of the most powerful passwords in the world", says "it is the most advanced technology", calls it "very high security".
So your VPN isn't adding any extra layer of auth? This doesn't seem like a TouchID problem. . .
This is also a security design problem. You shouldn't be transmitting sensitive information via e-mail. If I want sensitive data stored in your e-mail, I'd start with a phishing attack long before I decide that physically jacking your phone and coming up with a complicated finger-print stealing process is the way to go.
>This is not possible on the iPhone,
To unlock the phone. You can add additional layers after the iPhone auths all you want.
>consumers are under the false impression that fingerprints are the best security available, and they become frustrated to learn that Apple has been lying to them when corporate IT tells them fingerprints actually suck
Heh. If your clients trust Apple's marketing more than their own IT people this again speaks to me of a severe communication problem among the IT people.
>You can steal someone's fingerprint by simply having access to something they touched
Not with high enough resolution to reliably fool TouchID in few enough attempts to avoid it falling back to a passcode lockout. Your fixation on worst-case scenarios where your adversaries benefit from multiple strokes of blind luck doesn't make for great or realistic risk assessment.
>Network access to a corporate environment that has millions of SSNs, credit card numbers, etc. You think that a few hours of fiddling around with a latex mold is "too much work" for this? Think again.
And you're not monitoring for suspicious activity or any additional access control on a data source that has all that sensitive information? You're just letting people mosey on into it with just their phones without so much as a warning flag going up somewhere?
>Fingerprints are inherently insecure. Using fingerprints for more accounts is, thus, more insecure.
This is both highly simplistic and wrong. For one thing, passwords are also inherently insecure, especially when people write them on sticky notes and put them under their monitors. Secondly, not all accounts need maximal security. Not all activities within an account need to give people access to the maximal extent of their privileges. Insisting on going all out on every single thing people try to do fosters insecure habits and insecure system design. You're making the problem worse.
>Apple refers to TouchID as "the gold standard", "one of the most powerful passwords in the world", says "it is the most advanced technology", calls it "very high security".
None of which is false for the use cases they're talking about. You're talking about access to sensitive PII, which Apple did not tell you to gate behind TouchID. I also have years of experience in Infosec and setting the record straight on things takes all of 30 seconds of explanation and taking the time to understand their business context. All it takes is to not treat your clients with contempt.
You're vastly underestimating how valuable access to a person's phone can be. It's not just about quickly wiring money or stealing state secrets, but also about building blocks for social engineering campaigns, ad/app fraud, extortion, and all sorts of other things.
And the petty thief who steals your phone doesn't need to have the tools to spoof the biometrics. There just needs to be some criminal organization that does and that's willing to pay petty thieves for stolen phones.
And have a pipeline that can buy and move stolen phones fast enough to crack them before the owners can remotely wipe them.
Amazon would kill for that kind of logistical capacity.
Obviously past events are no guarantee of future, but still — most advisories like this frankly come across as fearmongering.
A judge can allow the police to knock down your apartment door through a warrant. But they can't compel you to speak and incriminate yourself.
Much the same, they can force you to reveal your fingerprint, but cannot compel you to share a password.
Unfortunately this is dependent on your jurisdiction. In Virginia it's been ruled that law enforcement can't force a password out of you, but in federal court and in other jurisdictions (Florida), they can imprison you indefinitely for not revealing your password.
The justification used is that the password itself doesn't incriminate you, it's just a password. The stuff that would be revealed with the password might be incriminating, but that's different.
If you're concerned about security, use the print/face and the password.
A phone is not "what you have" in the context of this discussion. "What you have" would be, for example, an NFC token used to authenticate to the phone. The phone itself is only "what you have" when you use it to authenticate to a different system.
In certain circumstances they can, but a phone sensor accepting whatever input it is being given isn't one of those circumstances.
Nothing is ever completely secure and usable. There will be some trade offs.
- Nothing - essentially what's on the lock screen (weather, maybe news headlines)
- Face - basic stuff - games, calculator, News apps
- Fingerprint - mail, calendar, text message, browser
- Pass code - banking, settings
A one-size-fits-all approach seems backward - there are some things I don't want to protect at all (don't care if someone can access them) on one extreme, and things that MUST be protected as much as possible on the other.
I get that leaking between apps is an issue, and there are other problems around this - but this approach seems more reasonable.
And yeah, for some users (my parents) they just want something simple and don't want to deal with this. So face or fingerprint is a lot better than no code, so this is still an improvement
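The tiered scheme above can be sketched as a simple policy check. The tier names and app lists here are hypothetical, just mirroring the list in the comment:

```javascript
// Hypothetical tier names, ordered weakest to strongest.
const TIERS = ["none", "face", "fingerprint", "passcode"];

// Hypothetical mapping of apps to the minimum tier they require,
// roughly following the list above.
const policy = {
  weather: "none",
  calculator: "face",
  mail: "fingerprint",
  banking: "passcode",
};

// An app opens only if the method used to unlock meets
// or exceeds the tier the app requires.
function canOpen(app, authMethod) {
  return TIERS.indexOf(authMethod) >= TIERS.indexOf(policy[app]);
}
```

So unlocking with Face ID would open the calculator but not banking; only the passcode opens everything.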
If it is a totalitarian regime, they'll just kill you. If you're ever really in such a situation, a blank phone is probably the worst thing you can give them.
Instead, why not a dummy profile that's complete with user activity, social media presence, and showing active harmless use? Why not multiple profiles?
For the rest of us, those who are not spies traveling in totalitarian regimes, what this means is you can hand someone your phone to let them use it. It means you can let your kid use it and expect to get it back without problems. You can even make the profiles based on the password, so that it only appears to have a single account.
Realistically, the biggest threat is theft. This doesn't hinder theft protection at all. It can still have the same protections, while just offering additional profiles.
Works pretty well for me, there's a little profile icon in quick settings to switch
Err... I use a Windows phone, even though I'm normally a Linux user. I kinda like it.
I suspect that it's a feature dropped from many/most customized versions of Android.
And where do you suppose this data will come from? Maintaining a plausible and active social media presence is not without effort, nor is creating a profile that would stand up to some scrutiny.
If people aren't really looking it won't matter much, but if they are and they find something that seems fake, it might end up getting you in much more trouble.
Because they're personal devices. And even if they had, 99% of the population wouldn't even know how to begin using them (like they don't have an extra profile on their laptop).
At most phones could use an easy "don't let the person I gave my phone to see my dick pics" mode or similar.
Email, for example. Day-to-day our normal authentication should cover what's in my inbox and/or the last few months of messages. A "deep dive" of emails from 10 years ago should probably require a second level of authentication. You don't access them that often. Yet, once you're compromised, your whole history of emails can get slurped up very quickly.
I pointed out to my wife not to email anything with our ssn to our tax guy. She kind of balked, but I pointed out if in 10 years he's compromised it's probably still in his email and trivial to scan for ssn or tax documents.
It's been years, but I was at a company that switched to an auto-delete policy after 90 days or something. I thought it was compliance related, but I also think they encouraged you to store important messages in a local inbox which would seem to contradict that.
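The step-up idea from the email example could be sketched like this: anything older than a cutoff stays hidden until a second factor is presented. The six-month cutoff and field names here are made up for illustration:

```javascript
// Hide messages older than a cutoff unless the user has passed
// a second, stronger authentication step. Six months is arbitrary.
const CUTOFF_MS = 6 * 30 * 24 * 60 * 60 * 1000;

function visibleMessages(messages, now, hasStepUpAuth) {
  return messages.filter(
    (m) => hasStepUpAuth || now - m.sentAt < CUTOFF_MS
  );
}
```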
It adds an extra layer of security. Not only that, you get to wink at your phone often as a sign of affection (LOL).
Instead of winks, one might choose to do other facial gestures such as stick their tongue out, do a duck-face, etc.
If you want to protect against police, whatever the reason, then a PIN is OK, but be careful when entering it in public.
The only reason I have a lock on my phone is so if I lose my phone somewhere randoms can't access it.
If they're going to mug me for my phone they can mug me just as easily for my passcode or my fingerprint.
I guess there is an argument for pick pockets but that's a lesser concern for me. Anywhere it's likely to happen my hand is usually on my phone.
2. I can also better hide myself entering the pin
3. As mentioned in other places, unlike a fingerprint or face, you can easily change PIN codes.
In an ideal world, facial recognition or touch id would serve as identification in addition to the pin code.
Pins are basically useless to anyone determined, no matter how much you try to hide it. There's no real bullet-proof security except for a really long password, but how many people are going to do that realistically?
If you don't join public WiFi hotspots, this isn't a danger.
Yes pins are not foolproof. They are however still better than facial scans or thumb scans.
For further clarity:
The retina is the thin layer of cells at the back of your eye that pick up light. The iris is the colorful ring on the front of your eye.
A retina scan is a thing most people wouldn't experience outside of an eye doctor's office. It requires really close proximity to the scanner, and it's very clear that it's happening.
Phones, including the Galaxy 8 that the author of the article mentions, use iris scanning.
Yes, your fingerprint and faceprint are irreplaceable. They're kept device-local for more reasons than just Apple Pay. But make no mistake: It's simpler to record you entering your PIN surreptitiously than it is to hack your biometrics.
Are the attacks against it "likely" or "unlikely"? They're clearly aware of the "photograph" and "Mission Impossible" scenarios, and demonstrated that they've spent time making sure those attacks are refused. If you're under a directed and specific attack, that's the best you can hope for from technology! It's not human. It can't magically evolve defenses against humans with time, patience, and hacking powers.
Devices are only safe when a human takes care of them. Your phone, your computer, phone companies, credit bureaus. When you don't take care of technology, someone will eventually exploit it, usually for greed.
Which is more likely:
Someone makes a cool mask device that can hack faceprint, and simultaneously gets sued by Honda Robotics for violating one of their many, many patents on lifelike robotic faces. They use it to hack you, and somehow materially impact your life through hacking your device.
Or, someone hard-resets your phone while you aren't looking, videotapes you entering your PIN confusedly, and then steals it.
A dedicated malicious attacker will always take the second path, because this biometrics crap is useless when you can just get people to blindly enter their PIN as if somehow it's safe to do so anywhere.
TL;DR: Enter your PIN in a bathroom stall, or it's on the security tapes of the mall. If you do this, don't enable Touch ID or Face ID. Problem solved.
The iPhone supports arbitrary-length alphanumeric passwords.
I feel my only issue with FaceID is that when you are in handcuffs all it takes is for the phone to be held in front of you. It will be interesting what safety regulations are used to prevent illegal entry by police or captors.
As far as I understand, face unlock follows the same rules as Touch ID, where 48 hours of no use requires the PIN/password, and the newly announced tap-the-power-button-five-times gesture triggers the PIN/password requirement.
And I don't understand why I can't (on Android 7) combine a fingerprint and then a PIN/pattern to unlock my device. It's mind-boggling and completely stupid.
Biometrics is biometrics. I like to think of it as sitting on a continuum between "username" and "password".
I might like a setting to require both a Touch ID (or Face ID) and a passphrase to unlock my iPhone. However, Touch ID has flaked out enough times for me (not accepting my fingerprints) that I probably wouldn't like to risk it in practice.
Passwords can be changed and are secrets. They're authenticators.
The difference between them is exactly the difference between identifiers and authenticators. Misunderstanding this difference causes tons of issues, in a wide variety of situations. The most notable one recently is probably Social Security Numbers being used as both, which leads to identity theft.
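A toy sketch of that distinction, with all names hypothetical: matching a biometric is like looking up a username (it identifies), and only a secret the user knows actually proves the claim (it authenticates):

```javascript
// Identification: a biometric match names *who* is at the device,
// like looking up a username. It is not a secret.
function identify(faceprint, enrolled) {
  const match = enrolled.find((u) => u.faceprint === faceprint);
  return match ? match.id : null;
}

// Authentication: only the secret proves the identity claim.
function authenticate(userId, pin, credentials) {
  return credentials[userId] === pin;
}

// Treating the biometric as identifier and the PIN as authenticator.
function unlock(faceprint, pin, enrolled, credentials) {
  const userId = identify(faceprint, enrolled);
  return userId !== null && authenticate(userId, pin, credentials);
}
```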
This will probably matter less once our future devices can interact with our sci-fi personal nanites, or rfid implants in the meantime.
I don't think TouchID / FaceID / 4 digits are intended for the 1% use case: preventing malicious actors with long term access to the phone from getting in (think police, government agents, etc.).
As a result, most people I see have 4 digit codes on their phones, without the "10 wrong passwords erases all phone data" option enabled. That alone is probably more insecure than FaceID / TouchID over the short term. Sure, a 3D map of your face or your fingerprint is not a true password.. but for this use case, they're a perfectly fine substitute while being more convenient.
If you want long term protection, then you should use an alphanumeric password for your phone with the "10 wrong passwords" option enabled. But most people don't want or need that.
This. I've been looking for a way to have two factor unlock (Fingerprint + PIN) for a long time.
I bet you there's an algorithm somewhere that can take a picture of your face and turn it into a 3d model. Then you can take that model, 3d print it, then use it to unlock your phone.
Also, as far as I understand, the demo videos are misleading: these systems (this and Windows Hello) are taking infrared pictures of your face, not visible-light pictures. From their perspective, you look like a (3D depth-tested) network of hot capillaries. This is 1. rather hard to recreate with any amount of sculpture-work, and 2. still identifies you "through" things like foundation/concealer creams.
So they have two bits of info, the 3D reconstruction and an infrared image with/without dots, and they also have a colour image from the FaceTime camera. I would be surprised if they didn't use more information than just the point cloud. Biometric security at airports uses a similar system; you can see the laser light when you go through the e-Passport gates.
Sure, there are entities out there who could probably crack this if they were inclined to target you. But is that truly -- and don't be hyperbolic here -- a thing that you worry about on a day-to-day basis due to actual experience of having been personally targeted by those entities? And is it never acceptable for a consumer device to be only "secure against the angry ex" versus "secure against Mossad"?
... basically because in less than 3 months, you will see an HN article of someone posting some sort of 3D photo of your face stuck to a watermelon, showing how to fool iOS and unlock your phone anyway.
I would suggest rolling dice instead. The PIN thus produced would be truly random.
When in a situation where there is higher risk, such as going through certain airports or borders, or where the chance of phone confiscation is high, just disable biometric authentication temporarily and use a PIN/password.
But in normal situations, where risks are low, re-enable biometrics for convenience. I like the ease and convenience of using the phone this way when I'm at home or at work.
Both Touch ID and Face ID are trying to protect against a complete stranger. I know that with Face ID (if it does exactly what the video suggests) it will be a harder challenge to unlock.
"That's a very nice hat."
(Math.floor(Math.random() * 10000) + 10000 + "").substr(1)
- Case 1: Imagine crossing a security check or border crossing. Guards just take your phone and point it at you: UNLOCKED. No need to resist giving up a password.
- Case 2: Drug the activist and point it at the unconscious victim! Voila!
- Case 3: Steal the phone, change the cover, and flash it in front of the real owner!
could go on and on ...
This seems extremely inconvenient for binge drinkers* that need to Uber home or call a friend.
*or light drug users
Case 2: Hold open the eyelids with tape. Even if the eyes have rolled-back in their sockets they can be re-positioned with some manual adjustment enough to get the system to work.
How is this different?
For case 1 - you can disable faceID prior to crossing borders.
It's not. DNA is exactly the same for identical twins, whereas their iris, faces and fingerprints are different.
This line makes it sound like android hasn't had this feature for years.
But I've noticed that it works well on some people and total trash on others.
If you insist on storing your leaked NSA documents or whatever on your phone, then you just have to accept that you're exposing yourself to a lot more risk (real or imagined) than you would have if you didn't store that stuff on your phone.
Since I definitely wouldn't wear my password around all the time in public, that sounds like a bad idea.
All I need is a way for the screen to ignore input (when it turns itself on) unless I activate it with the power button.
Are there really that many people who truly need very high security on their phones?
Seems to me most people just want to deter casual snooping.
Personally I would rather that any app that has high security requirements would secure itself and not require that the entire phone be secured.
Also a lot of services let you reset your password via SMS etc.
The only accounts you can change the password on with a simple email are low-security accounts without much worth stealing.
I'm sure there are people with more stringent security needs, but most people just need to deter casual snooping and don't need to stop hackers.
I personally do not like having the same security on everything. I find that detrimental. I think phone makers should do a better job of having multiple tiers of security.
Beyond that, it's pathetic click-bait.