Face ID, Touch ID, No ID, PINs and Pragmatic Security (troyhunt.com)
612 points by Artemis2 on Sept 14, 2017 | 305 comments

> ...when you do use the biometric options we're about to get into, you're still going to need [a pin] on your phone anyway. For example, every time you hard-reboot an iPhone with Touch ID you need to enter the PIN

This is what has been missing from every discussion of this issue that I've seen so far.

The face scan isn't "insecure" even if you're worried about border searches. Just turn off your phone when you get in the security line! The PIN will be required on startup.

A PIN is also required when plugging the phone into a new computer.

The rest of the time when you're going about your daily life, and are not worried about a government agent spoofing your face or pointing the phone at your face, you can use this nice feature.

Most people will be _less_ secure without it. They don't want to punch in a PIN every time they want to tap their phone to pay for coffee. So without the face scan feature, they will opt for no security at all.

The reboot/plug-in pin requirements change the discussion quite a bit, but are usually ignored, seemingly so bloggers can state the obvious "but someone can spoof your face!"

> The face scan isn't "insecure" even if you're worried about border searches. Just turn off your phone when you get in the security line! The PIN will be required on startup.

As far as border searches go, border officers have the authority to request your PIN just as they have the authority to request your thumbprint/faceprint/etc. If you don't give it to them, you can be detained and/or your phone confiscated [1]. Rebooting your phone won't help.

1: https://www.washingtonpost.com/news/volokh-conspiracy/wp/201...

They can request your PIN all they want, but you are not obligated to provide it. They can temporarily detain you but not indefinitely, and the EFF is challenging their authority to even do that. [0]

Personally, I would refuse to unlock my phone. My privacy and upholding civil liberties are worth being detained for a few hours (or even days).

[0] https://www.eff.org/press/releases/eff-aclu-media-conference...

This only works if your lawyer is with you or you're white.

Why would you have your lawyer with you? Better to have them outside of the range of border control agents (remote) so they don't get arrested too.

Does this work for non-US residents, e.g. tourists, conference visitors, etc.? Or do they politely show you the way out (of the country)?

No, if you're just visiting they can ask for whatever they please.

An important point: they would likely confiscate any devices you refuse to unlock. Is it worth that?

Yes. They are supposed to return it eventually, and even if they don't I'm willing to pay $1,000 to protect civil liberties.


Also, the "border" zone extends 100 miles inland from the Mexico/Canada borders and 100 miles from the shore. So if you're concerned, it's not just when you're entering or leaving the country. It's any time you're in LA, NY, DC, SF, Houston, or Detroit. Or anywhere along the other thousands of miles of border.

Common misconception. The actual rule: if you have crossed the border, then within 100 miles of it they can search you.

Of course, how you prove you didn't cross the border is an open question. But you can in fact refuse the search on that claim. I suppose they may detain you then.

I think you're looking for the phrase `probable cause`. If the officer has probable cause to think you crossed the border, they can search you. I'm not a lawyer, but I think they can search you anyway. It just won't be admissible in court.


To clarify, US law pushes police right up to the edge. There is a preference for false positives rather than false negatives: it's considered more important to catch all of the criminals than to avoid inconveniencing some innocent people. The risk of letting one criminal go is weighed as much greater than the cost of detaining a doctor for a couple of hours.

Now we're in this weird time where that doctor can have 20 years of HIPAA-protected medical records in their pocket that they might be forced to disclose. Historically, that doctor may have had some records in a briefcase, but not tens of thousands.

There are inland checkpoints, even on the Canadian border that search people who've been in the country for a while: http://www.washingtonexaminer.com/border-patrol-checkpoint-i....

They can even nab American citizens for drug possession:

"One of the people arrested was a U.S. citizen who fled the checkpoint and led the police on a five-mile chase. The unnamed man was arrested and charged with three felonies, including reckless driving, possessing a controlled substance, and endangering the welfare of a minor."

I read that they are not able to force you to provide a PIN.

But even if you're right, this doesn't change my argument. Most people are less secure most of the time in the absence of biometric authentication. Because without it, they will opt for zero security. You can always use the pin in addition, for whatever that's worth.

You know what would be really neat? A different, restricted/camouflaged unlock when you make a slight facial expression that would probably go unnoticed.

regular face: regular unlock

right eyebrow raised a tiny bit: hide my sensitive stuff from a casual search*

*and after a few minutes, if I don't deactivate it, start deleting.

Unfortunately, lying (including fraudulent representation) to a federal agent is a crime (https://en.wikipedia.org/wiki/Making_false_statements).

On top of this, iOS 11 introduced an emergency mode that is enabled by tapping the standby button five times rapidly. It is easy to do discreetly in your pocket, and it locks your phone by requiring your passcode to be entered again.

One thing to note: It's at least five times, not exactly. Just click that side button until your phone vibrates a bit and it's locked. I mainly see myself using this feature in high stress situations, like going through airport security or being pulled over, so not needing to count is a small plus.

It's also worth pointing out that iOS already has an option to erase your phone when the PIN is entered incorrectly a certain number of times. A lot of companies enforce this if you receive your work email on your device (eight times in my case). So if you are in a situation where a PIN is being demanded there's still a way to keep secure.
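The erase-after-N-failures policy described here can be sketched as a simple counter. This is an illustrative toy, not Apple's implementation; the class and method names are mine, and the default of 10 matches stock iOS's "Erase Data" option (the poster's employer enforces 8):

```python
class PinGate:
    """Toy model of an erase-after-N-wrong-PINs policy."""

    def __init__(self, correct_pin: str, max_failures: int = 10):
        self.correct_pin = correct_pin
        self.max_failures = max_failures
        self.failures = 0
        self.wiped = False  # stand-in for destroying the key material

    def try_unlock(self, pin: str) -> bool:
        if self.wiped:
            return False  # data is gone; nothing to unlock
        if pin == self.correct_pin:
            self.failures = 0  # a success resets the counter
            return True
        self.failures += 1
        if self.failures >= self.max_failures:
            self.wiped = True  # too many failures: wipe
        return False
```

On a real device the "wipe" is effectively instant because the flash is encrypted and only the key needs to be destroyed; this sketch just flips a flag.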

Your phone can easily be restored from an iCloud backup once you've got access to wifi. And if there's data you don't want seen, you can always restore from another device or backup in case you're somehow compelled to do a restore.

I'm not sure how I feel about storing 100+ GB in iCloud and restoring it via my data plan.

Local encrypted backups via iTunes are an option too, if you are traveling with a laptop.

Is that still going to work? I have the impression that Apple is simplifying iTunes and removing everything but audio and video.

I haven't tried since installing iTunes 12.7 (the version that removes the App Store etc.), but reports I've read suggest it still backs up everything with the notable exception of apps. These presumably have to be re-downloaded from the App Store on the phone itself now. I'm assuming, though, that all app data is preserved (one would hope so!). Happy to be corrected if this proves not the case.

From macdailynews:

"iTunes backups are still there for iOS devices, but performing a restore won’t transfer your apps from your Mac, but will instead download them over the Internet from Apple, which is, of course, likely to be slower."

You can achieve the same effect on some Android devices by triggering the password lock: scan the wrong finger repeatedly.

If a judge issues a warrant allowing the police to compel the use of biometrics, can a person use the wrong finger? Courts can't compel someone to enter their PIN. So can a court compel not only biometric "use" but the correct biometric?

My Android forces PIN entry after 5 wrong fingerprints.

If you refuse to tell them which finger you used, and comply by only pressing whichever finger they ask you to press, they will have a 10% chance of getting in each time.

So if you chose only 1 finger and you chose it randomly, they will have a 50% chance of getting in after 5 attempts.
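The arithmetic checks out, assuming one enrolled finger and the officer probing distinct fingers chosen at random (the function name is mine):

```python
from fractions import Fraction

def p_found(total_fingers: int, attempts: int) -> Fraction:
    """Probability the single enrolled finger is among `attempts`
    distinct fingers sampled uniformly from `total_fingers`."""
    return Fraction(min(attempts, total_fingers), total_fingers)

print(p_found(10, 1))  # 1/10  -> the 10% per-guess figure
print(p_found(10, 5))  # 1/2   -> 50% after five distinct guesses
```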

And hope they're biased against the middle finger

You've always been able to do this on iOS as well, but it risks an obstruction of justice charge.

Since I use fingerprint authentication for the vast majority of authentications with my iPhone, I set the actual pin to a complex alphanumeric password. I only have to enter it a few times a week, so it’s worth the hassle. In this sense fingerprint authentication has significantly improved the security of my phone, because it’s so much more difficult to shoulder surf that alphanumeric password than it is a 4/6 digit numeric pin. Biometric authentication definitely seems like a win to me.

I don't want to be that 'if you've got nothing to hide then' guy but why are people so worried about what border agents in particular will see on their cell phone?

I am not saying that I wouldn't mind at all if my phone was searched. But I can't think of anything in particular that I would be concerned about if it was. Sure in theory the agent could remember some personal information and come back later and use that info or pass it to someone for some nefarious purpose. But that's a pretty small chance event. It's more or less the equivalent of thinking you will get sick because you find a hair in your food at the restaurant. You don't like it and you send it back or demand a refund but the actual harm is more mental in nature.

Or what am I missing here?

There's lots of stuff I'd rather not be seen by strangers. Love notes to/from my wife, love notes to/from my mistress, photos of myself or others that I'd rather not be seen by strangers, financial information (through online banking apps, etc), just to name a few.

How do I know the agent isn't downloading the naked pictures of my wife? (There are lots of reasons to not keep such pictures on my phone, but "Because border agents may see them" should not be one of them)

And there's just the futility of it -- if I really had something to hide from the government, I wouldn't keep it on my phone (or if I did, I'd keep it hidden).

> the actual harm is more mental in nature.

That doesn't make it any less real - the government should not make me feel violated.

This was a particularly concerning case: https://www.theverge.com/2017/2/12/14583124/nasa-sidd-bikkan...

    Bikkannavar says he was detained by US Customs and Border Patrol and
    pressured to give the CBP agents his phone and access PIN. Since the phone
    was issued by NASA, it may have contained sensitive material that wasn’t
    supposed to be shared. Bikkannavar’s phone was returned to him after it was
    searched by CBP, but he doesn’t know exactly what information officials
    might have taken from the device.
Once he unlocked his phone, they were gone with it for 40 minutes. Plenty of time to copy the device contents and peruse at their leisure.

Depends on how difficult it is to produce an automated system whereby one needs only plug an unlocked phone into a computer in order to hoover up all locally-stored messages, contacts, saved passwords, synced browser histories, etc. The agent themselves doesn't need to care, they can just be instructed to "unlock the phone, plug it in to this computer, wait until the progress bar finishes, hand phone back". Are you on Github? Is your Github password saved in your browser? Do you have commit access to any marginally important projects? God only knows who has commit access to those projects now. Considering this is HN, I imagine this is an attack vector that a lot of people here are particularly concerned about. Given how much code we use that's written by others, it ought to concern the rest of us as well. :)

This is not an attack. This is iTunes. This is trivial to do.

Not sure why you have been downvoted, since it is a valid question and you phrased it in a productive way.

I think the reason people focus on border searching so much is that it's presumed to be one of the easiest entry points for the government to spy on you.

Many security researchers, such as people who work on Tor and related technologies, have had their electronics searched and sometimes seized at the border. That makes it particularly relevant to the HN crowd.

For me, I seriously doubt the government would ever be interested in my data. But imagine that the CEO of my company is suspected of some kind of crime. Perhaps when I am at the border they try to image my phone, in order to get access to things which may allow them to control other company assets (servers, etc), which would get them closer to their target.

I doubt it would ever happen to me, but I'd rather take precautions. Being searched at the border seems much more likely to me than, say, the NSA/FBI targeting me remotely with malware.

My Canadian friend (woman) made the mistake of making a connecting flight in the States on her way to visit me in Mexico.

The border agent accused her of prostitution. He wanted to get into her phone to see her latest Facebook Messenger and Tinder correspondence. She let him because, of course, it's easier than canceling her entire vacation plans.

Now, the tips in this thread wouldn't have helped her. But do you see how it's not just "I don't have anything to hide?" What about not letting some border agent power trip all over you? Reading your Tinder messages for fucks sake?

What if they automate the search and copy all your data for further mining and save all that info to see if you're not "desirable" for the "people in power"?

It's not about agents looking at your phone; it's about the slow erosion of our rights, among other things.

It is not only about you. It exposes everybody to blackmail by the government or whoever gets hold of the information: judges, politicians, businessmen, scientists, journalists, etc. It weakens the democratic fabric.

I'm not sure if it's relevant but Dubai and UAE have been known to put people in prison for quite unbelievable reasons often as they were transferring through the airport.

One such instance involved a British guy who had shared a Facebook post recommending giving aid to refugees; Dubai police put him in jail over it. That was over a year ago and I believe he's still there. Another involved a microscopic amount of cannabis found on the sole of a passenger's shoe; he got almost 4 years in prison.

There are many many other instances, if locking your phone stops them from even chancing it, it might be worthwhile.

What I worry about is Google Authenticator codes for several financial institutions, getting vacuumed out of the phone with everything else and stored on some insufficiently-secured database.

One phrase I've used with success in the past is that I have secure governmental cryptographic codes owned by someone other than myself on my laptop/bag/iPhone and am not allowed under law to disclose them to unknown parties such as yourself.

Most first level TSA/DHS people measure the cost/benefit of my assertion and back off.

Just because you are innocent and have nothing to hide, it does not mean they cannot still use the evidence you provide to convict you. Basically, it comes back to the same reason that many lawyers generally advise you to never talk to the police (1). IANAL.

(1) https://www.vice.com/en_us/article/mvkgnp/law-professor-poli...

Do you have any work data or email on your device? What is your employer's policy about granting 3rd parties access to such data? Could you be fired for doing so?

The issue is that they can compel you to unlock the phone and then make an image of the phone's contents which would allow them to clone it to another phone. They would then have full access to everything on your phone including any linked accounts and cloud services.

What's on your cellphone today? Who knows what it will be tomorrow.

I'd personally much rather the State fundamentally respected a right to private and family life, much like article 8 of the ECHR, but fat chance of seeing that in the USA any time soon.

You don't even have to do that anymore. With the new OS that will be standard on the new iPhones, you can just tap the power button 5 times to put it into a forced-lock mode. It'll require the use of your passcode before Face ID or Touch ID can be activated.

In addition, most customs/immigration lines have signs requiring you to turn off your phone.

I really liked this write-up because it focused on the practicality of the various security mechanisms. Most articles I see usually have a blanket statement like "All biometric security mechanisms are bad!". I think this article does a good job comparing the various logins and describing the pros and cons for different people. Specifically, I appreciate the author calling out when people bring up the "What if" edge-cases, where the correct response is you likely have much bigger problems at that point than the security level of your phone.

Specifically, getting more people to have better security on their devices is a very difficult User Experience problem, and Apple's pretty good at solving these kinds of problems.

TouchID moved the ball forward quite a bit, and FaceID will probably go even further.

Obviously neither provides ultimate security, but Apple has a strategic advantage since it makes both the hardware and the software. You can make the barn walls and roof super secure, but it does nothing if the front door is left open.

Before TouchID, I set my passcode to 0000 with a four-hour window where I didn't have to reenter it. I only had one set at all because Find My Friends refused to keep me logged in unless I had a passcode set.

With TouchID, I have a complex passcode that I have to enter a couple times a week. It's less secure than some hypothetical setup where I have a complex passcode I have to enter every time I unlock the phone, but it's far more secure than what I was actually doing before.

My android phone forces me to re-enter my passcode every 24 hours.

I think that strikes a nice security balance. If someone does get possession of my phone, I only need to stall for less than 24 hours.

The rest of the time, the fingerprint scanner works near perfectly. It's actually faster to use the fingerprint scanner than the standard slide to unlock, which is all I ever had setup on my previous phones.

iOS does the same after 48 hours of not being unlocked or re-authorized. I agree that this seems like a decent security compromise. Anyone with physical access to your phone for more than 48 hours has other vectors to pursue that are far easier than just trying to guess your password.
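The 24/48-hour rule both commenters describe boils down to a timestamp check. A minimal sketch, assuming a single "last passcode entry" timestamp; the class name, method names, and window value are mine, and real devices track several conditions beyond this one:

```python
import time

class LockPolicy:
    """Allow biometric unlock only within `window` seconds of the
    last passcode entry (simplified model of the 24h/48h rules)."""

    def __init__(self, window: float = 48 * 3600):
        self.window = window
        self.last_passcode_auth = None  # None until first passcode entry

    def passcode_entered(self) -> None:
        self.last_passcode_auth = time.monotonic()

    def biometric_allowed(self) -> bool:
        if self.last_passcode_auth is None:
            return False  # e.g. right after a reboot
        return time.monotonic() - self.last_passcode_auth < self.window
```

The reboot case falls out naturally: the timestamp starts empty, so biometrics stay disabled until the passcode has been entered once.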

Is this a standard setting that can be managed?

Here's a dumb question I have not easily found the answer to.

Can I configure my iphone to require TouchID, FaceID, and a PIN for each unlock of my phone every time?

>All biometric security mechanisms are bad

They are though, if bad == insecure. Customs can make you unlock with fingerprint or face. If you can't lock yourself out, it's not secure.

Biometric security mechanisms are the best security mechanisms.

Because they fall in the category of "will be used" as opposed to perfect security, which almost always falls in the category "won't be used"

That's why you temporarily disable biometric authentication before you go through customs. On iOS, this is as easy as turning the device off, and in iOS 11 it just takes five quick presses of the sleep button.

Customs can do that anyway if you have a password.

Constitutionally, an individual cannot be forced to enter a password for law enforcement (including customs agents).

That's the point: "constitutionally." In practice they can lock you up for hours or days on end while they obtain the warrant needed to make you give up your password, unless you are willing to stay locked up.

Many border agents, it seems, have the “if you have nothing to hide” mentality. So if you’re refusing to unlock your phone, clearly you’re hiding something.

Get that sweet settlement for wrongful imprisonment.

Or, you know, no settlement, and not even a "sorry, oops" either.

How many times has this actually happened though?

That would be comforting if both (a) and (b) were true:

(a) the world was only US citizens.

(b) nothing unconstitutional ever happened to the first group.

> Customs can do that anyway if you have a password.

Allow me to repeat myself,

If you can't lock yourself out, it's not secure.

For example, some banks have time locks. Nobody gets into the vault, unless it is in a certain window of time.

>If you can't lock yourself out, it's not secure.

It's not secure if you can lock yourself out either.

A court could hold you in contempt for failing to unlock, or for intentionally locking, the device.

And a state actor or even a mugger could just hurt or even kill you in frustration if you don't open it for them.

Re: the pushback the author got on Twitter; I believe in skepticism towards corporations and marketing claims, but the level of cynicism online towards any new tech idea or product seems a bit out of hand. There's a certain trend, on Twitter especially, of people racing to prove they're either more woke or smarter than the teams of people behind things that are yet to even be released. I mean a "wait and see" attitude wrt the actual effectiveness is good, but I don't get why we need to concoct extreme hypotheticals here suggesting Apple is somehow irresponsible for adding an optional feature.

> There's a certain trend, on Twitter especially, of people racing to prove they're either more woke or smarter than the teams of people behind things that are yet to even be released.

This has been a trend of a vocal minority on the internet for as long as I've been connected to it. Remember "No wireless. Less space than a nomad. Lame"?

> Remember "No wireless. Less space than a nomad. Lame"?

I hate linking to that site but this will always be golden: Apple's New Thing (a post from 2001) [1]

[1] https://forums.macrumors.com/threads/apples-new-thing-ipod.5...

"Attention awareness" is Apple's own description. How long before it's turned against the user to confirm "Advertisement Awareness" or "EULA Awareness", etc.?

>Advertisement Awareness

Fortunately Sony already has a patent for this exact scenario. Maybe that will deter Apple from building such a feature (lol).


Given that the authentication methods are "differently secure," wouldn't it be good if we were offered the option to combine them and require both for unlock? I would love to use Face ID + PIN or Touch ID + PIN for better security.

I want this, and also the ability to secure different areas of my phone.

I want to be able to set touch or PINs for certain apps, so that I can have multi level security. Why is there so much emphasis on one master password/touch ID/face ID instead of having multiple security checks?

All of my banking apps, among others (e.g. password manager), include options for both pin and touchID auth. Do we not already have what you’re asking for?

I think he's requesting it on the system level. For example, if iOS allowed you to force PIN+TID when opening a specific (not necessarily secure) app for the first time after an unlock. Intended for apps that don't necessarily have security built in already.

This is especially useful for when the app developer doesn't think there is a reason for them to develop such a system. For example, I don't want my son playing a zombie game, but I myself play it often. I could then lock him out of said game, but still give him my phone to use.

This isn't exactly what you are looking for, but have you tried Guided Access on iOS (or whatever the equivalent is on Android)?

Yes, that is the way I do it currently. It's a real pain though.

Which password manager are you using? 1Password doesn't support TouchID + PIN on iOS, but I wish they would.

I'm using LastPass on Android, and I have it set to require both my fingerprint and my LastPass password.

I've been wondering that too. If possible, it'd be nice to combine username, second factor, and password, as they all perform different functions that people often conflate:

- Your username is who you claim to be.

- Your second factor (faceprint, thumbprint, keyfob) is something you have or are.

- Your password is your proof: something you know.

Setting up a new device does require three factors: Username, Password and a 2FA on an existing device.

But even when the only option was a 4-digit PIN, most people didn't have one set up on their devices, so increasing friction will not result in more security.

If you want max security now, you can simply turn off Touch ID or Face ID and use a very long alphanumeric password.

Large public systems could do away with usernames if they assigned random, unique passwords to users out of a large keyspace. Usernames allow the keyspace to be smaller and the keys to have less entropy, making user-chosen keys possible. But then we add minimum-entropy requirements for keys (sometimes using only length as a proxy) in order to combat the freedom given.

Would be interesting to see whether username-less public systems suffer fewer intrusions. Some users would record their passwords in unsafe places, but they couldn't reuse a password, eliminating credential stuffing as an attack vector.

Of course, if you allow password resets you'd still have a kind of "backup" username presuming a person's email address could be used for the reset. But that wouldn't significantly weaken the security, arguably.
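The keyspace argument can be made concrete with a quick entropy comparison. The 70-character alphabet and the 128-bit key size are illustrative assumptions of mine, not figures from the thread:

```python
import math

def bits(keyspace: int) -> float:
    """Entropy in bits of a uniformly random key from `keyspace` options."""
    return math.log2(keyspace)

# Conventional scheme: public username plus an 8-character user-chosen
# password over ~70 printable characters (an optimistic upper bound;
# real human-chosen passwords carry far less entropy than this).
user_chosen = 70 ** 8

# Username-less scheme: the system assigns each user a random 128-bit key.
assigned = 2 ** 128

print(f"8-char user-chosen password: ~{bits(user_chosen):.0f} bits")
print(f"assigned 128-bit key: {bits(assigned):.0f} bits")
```

Even under the optimistic bound, the user-chosen password lands around 49 bits, so the assigned-key scheme buys enormous headroom at the cost of passwords nobody can memorize.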

I also believe that TouchID + FaceID would be a good combo as well for the vast majority of users. Both technologies seem to be relatively foolproof and user-friendly. Upping the ante from a security perspective would be a good thing. Unfortunately, that won't come to fruition.

There still must be another mechanism to access it like a password. What if the camera fails? What if you get punched in the face that day?

That's what it does right now: if Touch ID or Face ID fails, you still have your password to fall back on. I think it may do this automatically if the touch sensor dies, but you can always force it with the 5x quick taps on the power button.

It is, but GP is saying if the auto method is FaceID + password and FaceID doesn't work... then what? Clearly just the password isn't enough to get you into the phone.

We can have two options for unlock: (Face ID + PIN) OR (Alphanumeric Passphrase). The former for daily use and the latter as backup or for system restart.

I'm calling it now that Touch ID will make a comeback: https://twitter.com/ernsheong/status/908018119595003904

How bout: Touch ID + Face ID. Hahaa

I'm not sure that Touch ID can come back in its current form. Phil stood on stage and told us Touch ID has a 1-in-50,000 false match rate while Face ID's is 1 in 1,000,000.

I think the only way for Touch ID to come back is for it to cover the whole display, so it can authenticate every touch.
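Those stated rates translate directly into an attacker's odds over repeated random attempts. A short sketch; treating attempts as independent trials is an assumption, and the function name is mine:

```python
def at_least_one_false_accept(rate: float, attempts: int) -> float:
    """Chance that at least one of `attempts` independent random probes
    is falsely accepted at the given false-match rate."""
    return 1 - (1 - rate) ** attempts

touch_id = 1 / 50_000      # Apple's stated Touch ID false-match rate
face_id = 1 / 1_000_000    # Apple's stated Face ID false-match rate

for n in (1, 5):
    print(n, at_least_one_false_accept(touch_id, n),
             at_least_one_false_accept(face_id, n))
```

In practice the lockout after a handful of failed attempts matters more than the raw rate: an attacker gets so few trials that either figure is effectively out of reach by guessing.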

Agreed. If you could make it so every single tap was scanned, you could make the phone so secure, yet so simple to use.

The phone could constantly monitor both Face ID and full-screen Touch ID at the same time, and if it detects anything funky going down it can panic and lock. Would close down attacks where someone grabs your phone off you while it's unlocked and quickly disables security.

But then people you hand your phone to wouldn’t be able to use it

Authenticating every touch sounds way overkill. And your friend can't help you navigate around anymore :( Some human element is lost.

> I think the only way for Touch ID to come back is for it to cover the whole display, so it can authenticate every touch.

Touch ID only works on the first try about half the time for me, I would love to bring that experience to all my interactions with my phone.

You could set it up that at least one fingerprint has to be recognised correctly every X seconds.

I highly doubt this would be possible for a long time, but it would be interesting if Apple could equip the entire display with a Touch ID sensor. If you could scan the user's fingertips on every touch, then you could theoretically "sign" every interaction with the display.

Exactly what I'd like as well. I'd like to be able to use one of two methods to unlock: a fingerprint and a modest PIN (with lock-out after a couple of tries), and a long passphrase (also required at boot).

The problem is that the PIN is always dominant over Face ID or Touch ID. Face ID doesn't work? Use your PIN. What happens if you use local 2FA and your Face ID doesn't work? You can't enter the PIN, because that would effectively render 2FA useless. This would probably require some master PIN, which makes things more complex, and increased complexity correlates with reduced security.

It would be interesting if we could specify a particular face pattern to unlock the phone. Imagine you set up your phone to open only if you smile; now if someone picks up your phone and tries to unlock it by pointing it at your face, not smiling would be easier than closing your eyes or looking away. Not even mentioning the health benefit of just smiling :)

I would look forward to the headline where the police had to tickle the accused in order to get access to his phone against his will.

It would be awkward to smile/pose before/after a funeral, just because I need to call my mum or check my email...

That being said, I do think that there could be a legitimate use case here. One could set up a particular "emotion" (a face pattern) associated with someone forcing them to unlock a phone using their face. I mean, if someone pulls a gun or a knife on me, I'll probably just do as they say and look at the phone, rather than risk an additional hole in my body. But unlocking a phone and sending a distress call is something I could live with.

Yeah, if someone has a knife or gun I would just give them what they want and worry about a distress call after I'm safe, instead of getting fancy trying to activate an "I'm being mugged" feature.

I think that could be a nice feature, but trying to remember how to do that special thing or enter an alternate code would add stress to a situation where you should just be focused on staying alive.

I'm thinking of a situation where the phone, for example, is not enough for them. What if they don't leave you alone after that? What if you're a girl and they are going to try to rape you? What I'm thinking of is not a fancy help-me-I'm-being-mugged "duck-face" pose, but an actual face pattern which, simply enough, could offer assistance in a difficult situation. What if they take away the phone and I'm left with no chance of calling for an ambulance?

You are right, though, that this requires some "friction" and probably some self control.

As with almost all things, YOU DON'T HAVE TO USE THAT FEATURE.

They mentioned that it won't unlock if your eyes are closed or you are looking away. Doesn't help in a carjacking scenario, I guess.

Huh. Taking that a step further, instead of Face ID being a (probably) static shot of your face, what if it were a sequence? For example, if you could set your "code" to be smiling and then frowning? Or, given it can detect eye location ("awareness"), looking at different corners of the phone in a particular sequence?

Isn't this already possible by smiling while training your phone?

I am not sure but I think that if you smiled while training your phone you'd still be able to unlock it with any other facial expression

I'd like to see a duress expression. Sticking your tongue out would disable the face authentication and require a PIN.

Another approach might be to have your eyes follow a pattern. Either a static pattern which is always the same, or a dynamic pattern with a challenge each time.

Might not be completely universal though...

I seem to recall a video-based one that did something like this - it would tell you to make a specific facial posture to unlock the device, maybe Google's?

This is a really well-written, considered view of the trade-offs for using different options for security. I learned a lot from reading, and the plain language discussion of the topic allows most any reader to better understand the trade-offs present for each option.

Much appreciated to the original author - it takes a good deal of time and effort to write something that lucid. Thanks.

> a thread emerged about abusive spouses. Now if I'm honest, I didn't see that angle coming and it made me curious - what is the angle? I mean how does Face ID pose a greater threat to victims of domestic violence than the previous auth models?

If someone has the PIN and the phone, they can get in without the person (without their biometrics.) Fingerprints and Face recognition increase the chances that an abusive spouse needs the other person every time they access the phone.

Parents who have their children's passwords are in the same situation -- they can't snoop on their kid's biometrically secured phone (like reading a kid's diary in the old days). They have to have the kid open the phone, which means the kid knows that it's happening.

I'm not sure, but I know with my spouse and I, we always put each other's biometrics into each other's devices (I add her fingerprints to my phone and vice versa), share a LastPass account so we know each other's passwords, add our email accounts to each other's devices so we can always look at each other's email, etc. If you are married to someone and don't trust them enough to do the same, I have to question the foundation the marriage is built on. As for kids, we just don't allow them to have a phone or other device until they are old enough to buy it with their own money, which so far has never happened until they are nearly 18 and getting ready to leave for college anyway.

Not trying to tell you how to live your life, but being an open book to the other isn't very trusting. If your trust is built around being able to spy on each other, maybe you aren't trusting each other that much. My wife and I both have our own separate phones and computers without each other being able to access them, and I trust her not to do anything that goes against our interests and vice versa. That's what trust is: not having to know what the other does, and knowing they are doing the right thing.

I also let my spouse unlock my phone. It's not so much "being an open book" as it is "I trust her not to snoop, and sometimes it's convenient that she can open a map on my phone." That meets your definition of trust: not having to know what she does because I know she'll do the right thing.

Very fair point. In fact, my wife actually knows my phone's passcode for the exact same reason. "Open book" was definitely the wrong phrasing here; I wasn't trying to say that you have to keep everything secret from your spouse. Just that the exact opposite of that also strikes me as very distrusting.

There are also lots of people in abusive (or just "problematic") relationships for whom some privacy around their phone can be extremely valuable. That's why I'd prefer the societal norm to be NOT sharing credentials. Which of course doesn't mean that all people have to follow that.

I just got tired of having to unlock my phone every time I asked my boyfriend to send a message while I was driving, change what music was playing, etc., so I enrolled his fingerprint. I imagine most people that do this are thinking on the same lines... I would say that this is an indication of trust, as I trust him not to go snooping around on my phone.

It's kind of perverse to jump to spying, don't you think? I'm not spying on my wife, nor she on me. Even though I have her email on my phone, in the past year I think I've only been in there twice, both times when she asked me to look for something for her. I know she doesn't look at mine either, because she just doesn't care.

> If you are married to someone and don't trust them enough to do the same I have to question the foundation the marriage is built on.

This is such a stupid attitude. It is your way or nothing, right?

What the hell

I don't think it's completely insane, but I think the parent comment overestimates the proportion of marriages that have that kind of trust. If people didn't get married without 100% trust then most people wouldn't be married, and evidence is that people prefer imperfect relationships to loneliness.

It's also useful to have a norm against doing these things even if you have 100% trust, because it's probably the case that people are too readily fooled (by themselves or a manipulative partner) into thinking they have 100% trust when they actually don't. So having a norm against sharing everything protects the people who need it, even if you're not among those people.

> Fingerprints and Face recognition increase the chances that an abusive spouse needs the other person every time they access the phone.

No they don't - you have always been able to use a PIN instead of Touch ID, so knowing the PIN still works just as well with or without it.

So... wouldn't that make biometrics a lesser threat?

Would be interesting to enable voice authentication contemporaneous with face scanning to make sure the lipreading matched the utterance matches the voiceprint matches the expected face. Bonus points that a vocal channel could be used to detect duress (especially if accompanied by, say, raised eyebrows) and either require further authentication (passphrase entry) or a "false unlock" to reveal only a nearly factory fresh app and data underlying. Could also potentially send a notification to friends that your phone had just been unlocked under duress. Bonus points for in parallel hard-scrubbing the underlying true data while displaying the false boring phone interface.

Near-field worn devices.

http://nfcring.com is an example of what I have in mind.

What I'd like to see is this tied into an identity system, such that the ring (or other very-hard-to-misplace, but replaceable and discardable) token is not itself an identity, but rather an access token to an identity store which can present any given identity to any given system.

That might be a consistent identity across multiple sessions or unique identities on each session. The identity might be tied to some central certifying agency (e.g., a motor vehicles department or national pensions fund), or not.

There are several elements of this which I'd like to see developed further, including how keys might be reconstructed or recovered using a quorum system of trusted sources (divide your key into pieces, share those amongst friends, family, or some local authority, such that key loss need not equal data loss), and possibly via law enforcement.

I'm also looking at the possibility of a public ledger system which might allow for both workfactor requirements and public disclosure of keys being revealed. This may be a viable application of crypto, though I'm not entirely sure of this.

(The feature might also be optional -- you could take the risk of key loss, or allow for recovery. But the present situation with PKI of losing access to all previously-encrypted data in the event of key loss would be mitigated.)

There's also the requirement for devices to have support for near-field readers. I'm told this is already largely a reality, though my reading of specs for various mobile devices suggests otherwise.

The biggest challenges through all of this are not the technology itself, but the adoption, requirement, and enforcement of standards, including availability of tokens at low or no end-user price. Trust of the information ecosystem overall might be a suitable incentive for this to happen.

So, someone steals the NFC ring and then owns the phone? Ring + heat detection of PIN tap pattern will end up giving a false sense of 2FA. (Not sure how the ring authenticates on being worn; I didn't see it on the website.)

You can repudiate the device.

Stealing physical data itself is far harder than password appropriation.

And a PIN or password / passphrase, plus rate limiting, might still thwart scale attacks.

You're raising attack costs significantly.

Or cutting off a finger or hand?

Difficult to arrange by a phishing email, website, or trojan.

Possible to countermeasure as well.

Personally, I'd like to see identity tied to my smartwatch, with authentication happening via capacitive coupling + Bluetooth.

The way I'm envisioning it:

1. Physically touch the object you want to authenticate to. (E.g. Computer, payment terminal, smart lock, etc.) Watch uses capacitive coupling to bootstrap a Bluetooth connection to that device.

2. Device requests authentication & authorization from Watch.

3. Watch either authenticates you instantly (for lower security applications), or requests you to confirm the transaction with your fingerprint/face/PIN (higher security applications)

This method would also enable a lot of other neat tricks to further increase security, like checking your heart rate and refusing to authenticate if you're asleep or the watch isn't strapped to your wrist anymore, requiring additional authentication methods to unlock your watch after you take it off, displaying the dollar amount for monetary transactions on your watch when asking for approval, etc.

A watch or bracelet could also work, of course. Even a neck pendant if that's your thing. The point is physical, on your person, and crypto based on near field.

The problem with longer ranges, even just a few cm, is the prospect of snooping or triggering unintended authentications. My preference would be mm range.

> The problem with longer ranges, even just a few cm, is the prospect of snooping or triggering unintended authentications.

The advantage of using a watch (or another device with a built-in screen) is that it avoids exactly that problem. When you authenticate, you have to physically press a button on the device, and the screen tells you precisely what it is you're authorizing.

Using capacitive coupling as the initial communication channel would also help with that, since you'd have to actually touch the object you want to authenticate to with your bare skin.

Both good points. I've been thinking of some contact / button interaction as well, though that's a toss between keeping the device as physically and electrically simple as possible, vs. some level of interaction. A circuit-completion button on a ring might work, which wouldn't require, say, an additional battery. Though battery life would quite likely be years.

Do you have any refs on capacitive coupling? Is that essentially touchscreen devices? How does that fare in exposed / outdoor environments? I'm thinking of wide applications, and something which couldn't operate at, say, Tokyo Subway levels of use and demand isn't particularly amenable to that.

(That's tabling the discussion of whether or not you'd want to have per-use charges for transit use or want to offer that as a public service, or only filter based on individuals, etc.)

Field range might be set by speed-of-light delays: roughly a nanosecond per 30 cm (about a foot).

Given a 4 GHz clock speed, your time resolution is about 0.25 nanoseconds, or 7.5 cm -- call it 3 inches.
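The arithmetic works out in a couple of lines (the constant and clock speed are the ones from the comment above):

```python
# Light travels roughly 30 cm per nanosecond, so one clock tick bounds
# the achievable timing (and thus distance) resolution.
C_CM_PER_NS = 29.98  # speed of light in cm per nanosecond

def resolution_cm(clock_ghz):
    tick_ns = 1.0 / clock_ghz     # one clock period, in nanoseconds
    return C_CM_PER_NS * tick_ns  # distance light covers in one tick
```

`resolution_cm(4.0)` comes out near 7.5 cm, matching the "3 inches" figure.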

Slightly ironically the TLS is broken on that nfcring website.

Yeah. I ... had to edit the URL, as I'm used to specifying https rather than http these days.

There are a few other flags raised about that particular implementation, though the concept itself is the key point. The idea of a signet ring to authenticate, sign, access, pay, claim, and/or decrypt seems useful.

You're not alone - I do that every time I copy or share a URL too. It's a good habit to get people into, and sites that fail to provide working HTTPS don't really have an excuse these days.

Eh. For me having smart unlock attached to my watch works. Sure, the range is farther than with nfc, but it's acceptable to me.

It appears that FaceId only supports a single face (unlike TouchId, which supports multiple fingers).

Maybe this use case isn't common, but my wife frequently needs access to my phone. Usually while driving, to change GPS routing, or playlist, or respond to SMS. With TouchId, she can do so without my PIN. With FaceId, she needs my PIN.

This strikes me as both less secure and quite annoying. Now I have to repeat my PIN out loud while she types it into the device, or force her to memorize it (in addition to her own PIN, and I have to remember hers for the reverse situation).

For me it's not so much the paranoia or the degree of security (which is an arguable point in itself) but the convenience of it. Touch ID lets me unlock my devices without having to re-position my upper body or move them in (practically) any way, whereas Face ID feels awkward. (I'm typing this on the device that is likely an exception to that - a Microsoft Surface Pro - and Windows Hello's face recognition works beautifully, but I am _always_ facing it when I need it to unlock, so...)

> Face ID feels awkward

Have you used it?

There's some app that uses the face++ api to do login using facial recognition, I can't recall the name of it off the top of my head but maybe you can find it with that info.

edit: how could I forget, it's the little known app called Alipay.\s

But what was wrong with Touch ID? Were there any examples of it being weak security? What will come after Touch ID? Will Apple continue progress and build in PinchID - a tiny needle that stings you to test if you are you, based on your blood/DNA?

This is a serious question, because if there was nothing wrong with Touch ID, then why was it removed from the new phone and replaced with Face ID?

I'm also concerned about the data Apple will collect. I assume the information about your face has to be very detailed for Face ID to be secure. Is it really that much of a stretch to see an article in 6 months: "FBI got a copy of the whole Apple face database and was able to identify and find a very dangerous criminal"?

TouchID was removed because it took up space on the front of the phone and Apple wanted the screen to be bigger. There's no deeper reason than that.

> Will Apple continue progress and built in PinchID - a tiny needle that sting you to test if you are you based on your blood/DNA?

I struggle to believe you when you say that's a serious question...

> Im also concerned about the data Apple will collect.

The FaceID data will be stored in the secure enclave locally on your phone, just like TouchID data was. Apple will not collect it.

> TouchID was removed because it took up space on the front of the phone and Apple wanted the screen to be bigger. There's no deeper reason than that.

The Pixel handset has the fingerprint sensor on the back of the phone. It appears to work quite well. Much of this needless outrage could be obviated by allowing multiple simultaneous biometrics for auth; while taking the phone out of your pocket, place your finger on the sensor to initiate FaceID.

> The Pixel handset has the fingerprint sensor on the back of the phone.

How would I use my phone on a desk without lifting it up? I can do that with TouchID, but if the sensor is on the back, I can’t.

I confess, never in my years of phone ownership have I had that specific use case, or considered that it might be important for others. Do you find yourself doing that often? If so, can I ask what for?

You can't really do that with Face ID either, unless you loom over your desk in an awkward way.

From their promo video of iPhone X, FaceID works at angles over 45° (see the swimmer portion)

> multiple simultaneous biometrics

That's very un-Apple.

Second, facial features are more unique than fingerprints - according to apple's own presentation, there's a 1 in 10.000 chance that prints from different people would unlock it. With face ID, this becomes 1 in 50.000 (iirc).

Unless you have a twin.

Twins have different fingerprints.

I believe they said 1:50,000 and 1:1,000,000.

He's not correcting the use of a period, he's correcting the numbers themselves. 1:50,000 and 1:1,000,000 vs 1:10,000 and 1:50,000
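One way to make those per-attempt rates concrete is to compute the chance that at least one of N random strangers would get a false accept:

```python
# Compare per-attempt false-accept rates by the probability that at least
# one of n_strangers random people would unlock the device.

def p_any_false_accept(rate, n_strangers):
    return 1 - (1 - rate) ** n_strangers

touch_id = p_any_false_accept(1 / 50_000, 1000)    # roughly 2%
face_id = p_any_false_accept(1 / 1_000_000, 1000)  # roughly 0.1%
```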

TouchID is trivially defeated by a 6-year-old:


Did you read the article? Troy Hunt addressed that.

All the more reason for FaceID

The face database, as you imply, is stored on the phone in a secure enclave. None of this stuff is sent to Apple.

Not everybody has fingerprints. I may be the minority here, but I look forward to not having to enter my pin each time.

There are more people who cover their face (e.g. Muslim women wearing burkas) than people who don't have fingerprints. I wish they had both TouchID and FaceID.

Well, they do.

You have to decide which type of biometrics to use when you choose your model, but that's fine for a lot of cases (such as those who wear burkas or those who have no fingerprints).

Hopefully the return period gives people enough time to be sure they've made the right choice.

TouchID also is problematic if you're wearing gloves, which people who don't live in San Francisco do during non-trivial portions of the year.

Not sure if this applies to Apple's implementation, but my phone fingerprint recognition fails if my finger is wet (sweat, washing hands).

I haven't tried training it with a wet fingerprint.

This definitely applies to Apple's implementation. It's infuriatingly sensitive to even damp fingers, in my experience.

> people who don't live in San Francisco

I find this humorous because for me, an Australian, San Francisco is quite cold and I wore gloves there.

> TouchID also is problematic if you're wearing gloves

And the touchscreen?

There are plenty of gloves designed to work with capacitive touchscreens (small wire mesh in the fingertips), but none that let you use Touch ID.

As long as you can easily unbutton your shirt:


Also fails constantly when you're doing certain physical tasks. It becomes nearly useless after a few hours of doing landscaping work.

Curious how much face one needs to show during winter for FaceID to work.

Well, you can always still use your PIN, same as you can with Face ID.

It was so they could remove the home button.

> without having to re-position my upper body

why would you unlock your phone if you're not going to look at it? I don't understand this argument.

I unlock the phone while it's still in my pocket; by the time it reaches eye level, it's already unlocked. And with a few muscle-memory tricks, there's even a chance I've opened the right app without even looking, in the fraction of a second it took me to take the phone out of my pocket.

I've actually never seen anyone do that. I don't think that's a valid argument against Face ID. It's still faster than a PIN code and (seemingly) more secure than Touch ID.

You've never seen someone unlocking their phone while they're taking it out? It's fast, you might have missed it.

as fast as Face ID probably. I know I've seen people unlocking their phone by mistake in their pockets for sure.

I do the same even with PIN. I know where the keys are and have it unlocked before I'm looking at it.

my phone is currently sat on my desk, about 10 inches from my right arm. I can, and do, check messages on it, by only repositioning my arm to unlock it.

I easily read any messages by glancing at the phone, never coming into any decent imaging range.

Turn on the front camera, lay the phone on your desk. Can you see your face in the image? Then you can unlock your phone by glancing over at it.

My phone is on my desk next to my laptop in the same place where I've been texting my wife for the past five minutes. I have turned on the front facing camera.

My phone can see the very corner of my head and about half of my eyebrow. I do not want that to be enough to unlock my phone.

Apple would be happy to sell you a watch for that…

If you have a mac you can use Messages to check your message on your laptop directly.

Pretty much everyone at your standard 9-5 office job has their personal phone with them all the time and their personal computer with them none of the time.

You can work on a mac...

Signing in to personal accounts on a work computer is a terrible privacy practice and also unprofessional IMO.

One reason would be ApplePay. At least in Europe most payment terminals have their NFC sensors on their side, so you're supposed to hold your phone flat with the side of a box, so the selfie camera just points to the wall on your left. In that position, there is no way that the camera of your iPhone could see your face.

Compare this to Touch ID, where you hold your phone to that box with your finger on the sensor and half a second later you've paid - very convenient.

I would guess that Apple has not yet tested an iPhone X with European payment terminals and will get a lot of angry emails once people try that combination.

I wonder if you can get Face ID to work with your butt or other body parts, similar to how people got Touch ID to work for certain "non-thumb" parts of the body.

Good, balanced, pragmatic discussion.

Honestly, the only downside I can see vs. Touch ID is that you can in theory point the phone at the person and unlock it. However, this is balanced out by it not working while unconscious.

PINs, as discussed, are not directly comparable.

How does it not work while unconscious? Eyes need to be open?

That and looking at the device.

What I want to know is what face data is shared with 3rd parties like snapchat. That seems like the bigger threat, and no one is really discussing that.

Unless Apple provides only an API to interact with the model, not access the underlying data. Which, considering it's Apple, I'm sure they did.

If it is like touch id, then nothing at all.

So, with Face ID, can you prevent someone trying to compel you to unlock your device by simply closing your eyes or looking away?

Yes. If you have time to do it, you can also press the on/off button five times; that disables Face ID and forces entering the PIN. It's an iOS 11 feature; I just tried it on my iPhone 6 and it disabled Touch ID.

That was the way it was described in the Keynote

How do you know when to open your eyes again?

You can open your eyes but not look at the phone.

What if users were able to disable Face ID by configuring blinking x times, or by having their eyes closed for a certain time period? Maybe requiring Face ID + a different PIN after it recognizes that gesture on the lock screen.

What about Face ID + PIN? That would mean someone would have to know your PIN as well as have access to your face.

You also wouldn't have to look so paranoid while entering the PIN, and the PIN by itself would be of little value.

For phone-based biometrics, the PIN has always been a backup for the fingerprint/face recognition, because these things aren't 100% reliable. Not even considering any security aspects here, just practicalities, like your hands are dirty or your face isn't being recognised (perhaps you've got bandages on your face or whatever). Having the PIN as a replacement is a good thing in these cases, otherwise you could be locked out of your device when you actually want to use it.

For 'extreme' security situations, you might as well just have a long secret PIN and no biometrics.

FaceID + shortpin = unlock

longpin = unlock
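That scheme can be sketched as unlock logic. The enrolled values here are made up, and a real system would store salted hashes in secure hardware, not plaintext PINs:

```python
# Sketch of the "Face ID + short PIN, or long PIN alone" unlock scheme.
import hmac

SHORT_PIN = "4821"          # hypothetical enrolled short PIN
LONG_PIN = "482173950361"   # hypothetical enrolled long PIN

def try_unlock(face_matched, entered_pin):
    # constant-time comparisons to avoid leaking digits via timing
    if face_matched and hmac.compare_digest(entered_pin, SHORT_PIN):
        return True
    return hmac.compare_digest(entered_pin, LONG_PIN)
```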

Yes! I'd love this.

Too complicated

it can just be a setting you either enable or don't

The author started off saying how less than 1% of Dropbox users use two-factor authentication. What good is such a scheme when nobody is going to use it?

Heck there are a lot of people who don't use TouchID on iPhones. There are a lot of people who don't use ANYTHING.

I think you're right that the number of people who would use such a system is trivial (compared to sales).

Stolen iPhones should be worthless.

Apple needs to create a system where stolen phones can be reported to them; Apple can then contact the owner and verify they are stolen, add them to a stolen list, and disable calling/apps on those phones. And display an overlay on the screen: THIS PHONE IS STOLEN.

Every iPhone would come with a "validate phone" feature, accessible even when locked, that can authenticate the iPhone for anyone thinking of buying it.

The potential buyer can check if the iPhone is stolen by using the feature, which is allowed to connect to the internet and validate the phone.

They need to make it so stolen iPhones are worthless, so when you are getting mugged, criminals won't even want them.

Obviously have an option set up where you can transfer ownership of your phone. Maybe with a 7-day waiting period.

Find My iPhone + iCloud is what you're describing.

Even a DFU restore of the device won't help a thief, as the activation process will simply ask for your iCloud login and will display a "Message From Owner" that you can set at icloud.com indicating the device was stolen, making it much harder for someone to purchase and claim ignorance about the origins.

Can confirm. We ran into this problem at my company, as we had contractors set up phones and turn on Find My iPhone. We ended up having to do some not-so-great things to make it so we could use the devices again.

What not-so-great things?

Yeah, that's true, but I feel like Apple could do more so criminals don't even want to take iPhones. Apple could put things in place so they aren't worth anything on the secondary market: basically unusable, and could lead the cops to your house.

This way if you're on the subway you can have your phone out without worrying about getting robbed.

If your house gets robbed, they would leave your phones because they can't be sold or reused.

They already are. You can sign in to your Apple ID, nuke your phone from orbit, put it in Lost Mode so that it can't be used for anything other than 911 calls, etc etc etc. And you can call your carrier, report it stolen, and bam - the ESN is blacklisted and the phone is a brick.

I'm guessing they are still useful for parts though? Screens, batteries, cameras....all of that still works even if the motherboard is disabled.

Sure, but the value of parts from a stolen device is much less than the value of a working device.

But the demand for parts is a lot higher than that for stolen phones.

You might not buy a phone from an internet cafe that sells 2nd hand phones, but you might buy a replacement screen from an internet cafe that does repairs.

As others note, it's already possible to remotely wipe/lock a stolen phone, and if you report it stolen to the carrier, they'll list the IMEI in a database shared to other carriers; many will refuse to activate/connect when their SIM is inserted into a phone with a known-stolen IMEI.

Short of visibly, physically destroying the phone, though, there's nothing you can do to prevent this. A criminal doesn't care that the phone is a brick; they'll sell it to someone and be miles away with the money before the buyer realizes the phone is useless.

Once it is reported stolen, it could capture the face of everyone who tries to access it. If you assign your police case number in iCloud, it could automatically send your police department these faces (with times and locations) to expedite recovery. With all the face databases popping up, they'll easily have a short list of people to follow up with.

Then again, this might create lots of other problems.

Find My iPhone has a feature called Activation Lock which matches up with what you're describing: https://support.apple.com/en-us/HT201365

The phone won't do anything until the person who turned on the lock turns it off again.

IIRC if "Find My iPhone" is enabled then the phone is worthless (except for parts)

Activation Lock does exactly what you want and already exists. If someone steals an iPhone associated with an iCloud account, it's completely useless to the thief.

> Apple need to create a system where stolen phones can be reported to them, Apple can then contact the owner/verify they are stolen. And then add them to a stolen list and disable calling/apps on those phones. And display an overlay on the screen THIS PHONE IS STOLEN.

Why would I want to give a third party the ability to brick my phone? If Apple has that power, then a disgruntled Apple employee can use that power and a government can compel Apple to use that power.

My computing devices are my computing devices. Only I should have the ability to do anything to them.

The complete business model is taking the p*ss imho. I am seeing quite a number of people reverting to simple €20 Nokias for basic telephone + SMS usage, on top of a gadget / secondary device for consumption or mobile business.

Apple sold over 200 million iPhones last year, and they make up a relatively small proportion of the overall smartphone market. I don't think there are very many people like you describe.

Free "rein".

Nice article. However:

> It's alarming not just because the number is so low, but because Dropbox holds such valuable information for so many people.

I'd suggest that Dropbox users somewhat self-select for those less concerned about security than others, and more concerned about availability.

Dropbox does not encrypt your data server side (or at the very least, can easily decrypt it). And they have proponents of warrantless surveillance on their board:


> Dropbox does not encrypt your data server side (or at the very least, can easily decrypt it).

I think claims like this need to be backed up.

Now, obviously a biased source, but Dropbox itself says this:

"Each file is split into discrete blocks, which are encrypted using a strong cipher. Only blocks that have been modified are synced. Each individual encrypted file block is retrieved based on its hash value, and an additional layer of encryption is provided for all file blocks at rest using a strong cipher. Both dedicated internal security teams and third-party security specialists protect these services through the identification and mitigation of risks and vulnerabilities. These groups conduct regular application, network, and other security testing and auditing to ensure the security of our back-end network. In addition, our responsible disclosure policy promotes the discovery and reporting of security vulnerabilities." [0]

So we have files that are broken apart, each part encrypted, then the whole combination encrypted again, then lots of security auditing in-house and outside, and with incentives for people that discover flaws to report them. That seems pretty industry-standard to me, but I'd like to know more.
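The block scheme that quote describes (split content into blocks, address each by its hash, re-sync only modified blocks) can be sketched roughly like this; the 4 MB block size is my assumption, and the per-block encryption layer is omitted:

```python
# Sketch of hash-addressed block sync: only blocks whose hashes aren't
# already stored need to be uploaded again.
import hashlib

BLOCK_SIZE = 4 * 1024 * 1024  # assumed block size, not a Dropbox spec

def blocks_of(data):
    return [data[i:i + BLOCK_SIZE] for i in range(0, len(data), BLOCK_SIZE)]

def changed_blocks(old, new):
    """Blocks of `new` whose hashes were not already stored for `old`."""
    known = {hashlib.sha256(b).hexdigest() for b in blocks_of(old)}
    return [b for b in blocks_of(new)
            if hashlib.sha256(b).hexdigest() not in known]
```

An unmodified file yields no changed blocks, which is why only edits cost bandwidth.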

I really have some difficulty imagining a company like Dropbox, which knows how important the documents it stores are, being careless with security. Not saying they may not be, but it's going to take more than an HN comment that includes some politicized perspective about the Bush administration to convince me.

Furthermore, this article [1] claims that Dropbox encrypts files on the server even stronger than Google does. It also points out that user behavior is usually the main security hole, which will always be true with any service.

[0] https://www.dropbox.com/security [1] https://www.virtru.com/blog/dropbox-encryption/

Yes, Dropbox uses encryption in transit and at rest, and security would certainly be a top priority for any custodian of that much data. Dropbox is an industry leader, and in many ways sets the course for the entire industry to follow in this regard.

The issue with Dropbox is that they also have access to your encryption keys, which means they can easily decrypt and access your files, at their discretion.

According to Drew Houston (Dropbox CEO), they need access to your files to offer features like search, to better understand how you're using the service, to integrate with third parties, and for law enforcement. Some of these "trade-offs" are mentioned by Mr. Houston himself in this interview, when responding to criticism from Edward Snowden a few years ago: https://techcrunch.com/2014/11/04/dropboxs-drew-houston-resp...

More to the point, giving Dropbox (and their affiliates and trusted third parties) permission to access your files is a key provision of the Dropbox terms of service:

> Our Services also provide you with features like photo thumbnails, document previews, commenting, easy sorting, editing, sharing and searching. These and other features may require our systems to access, store and scan Your Stuff. You give us permission to do those things, and this permission extends to our affiliates and trusted third parties we work with.


As Mr. Houston said in the article referenced above, if you want better encryption there are alternatives.

Disclaimer: I work at Sync.com

It does sound like they can easily decrypt the data, since they have the keys.

> I'd suggest that Dropbox users somewhat self select for those not as concerned about security as others. And more concerned about availability.

I would rather say that Dropbox is used by many people without tech knowledge. And while they might be concerned about security, they often just don't know how important two-factor authentication is. At least that's what I see with some friends & family.

I have tech knowledge, but I had absolutely no knowledge that Dropbox offered 2-factor.

I don't keep confidential stuff in Dropbox because I know that the company effectively has access to everything. Nonetheless, two-factor sounds interesting. So I look at this:


Right. Now I understand why so few people have it enabled.

Explain? You read a page about two-step and say that explains why no one has enabled it? You claim to have tech knowledge, but are not able to turn on this simple security setting (or even know it exists, despite that it's listed very clearly in your Dropbox settings page)?

I use two-factor/two-step verification on every single service I have, including all social media accounts, email accounts, etc. Most major services/sites these days provide it. The way Dropbox does it is no different than any others; it takes 2 minutes to set it up. What did you find difficult about it?

> You claim to have tech knowledge, but are not able to turn on this simple security setting

GP didn't say _they_ can't enable it after reading the help page. I think they are implying that the very detailed help page looks long and complicated to a non-techie (who might not even understand the benefit of going through such a hurdle in the first place).

> Before enabling two-step verification, you'll receive ten 8-digit backup codes. It is very important that you write these codes down and store them somewhere safe.

Do any of your other systems handle recovery like that?

Eh, since when does 2FA protect against back-end breaches?

That's just a security layer against phishing or password leaks...

Don't get me wrong, I'd advise everyone to use it for anything remotely critical, because it's pretty easy to set up and live with, but it really doesn't help against state actors or hackers who have compromised the data servers.

A Dropbox back-end hack isn't something most users have to protect themselves against. Same as with a Google breach where Gmail data gets leaked.

As a personal user of no special interest, no one would use a potential Dropbox vulnerability to just get your data. If you secure your end you'll be fine. That's different for big corporations or people with a public profile (e.g. politicians). In this case you have to ensure that malicious actors with a lot of money and knowledge cannot gain access. But then again, self hosting is likely less secure than Dropbox or Google.

> since when does 2FA protect against back-end breaches?

Because if someone steals your Dropbox password, they still won't be able to log in to your account.

Maybe they won't have to, if they also stole your data and found a way to decrypt it, but since those are different things, it is plausible that there could be a leak of login information without a leak of data, in which case your two-factor authentication would keep the attackers out of your data.

> since when does 2FA protect against back-end breaches?

It doesn't have to. For 99.999% of people, the two most realistic threats are:

* There is a keylogger on some computer where you access your Dropbox, for example at a print shop.

* You use the same password on many sites, one gets compromised, and automated bots try to access your Dropbox account.

Confidentiality, Integrity and Availability (CIA). Those are all part of information security.

This is actually quite interesting, since it is a bit like the CAP theorem. When you increase confidentiality and integrity, you might be affecting availability in a negative way. Take Dropbox as an example. Since they don't have an efficient end-to-end encryption offering, they can offer you password resets (= availability of data is good). Add in secure end-to-end encryption and password resets won't be enough: you additionally need good backups of the encryption keys to ensure continued access to the data.
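The trade-off is concrete: in an end-to-end model the file key is typically derived from the user's password, so a forgotten password means the key (and the data) is gone unless the key itself is backed up, whereas a provider-held key allows resets at the cost of the provider being able to read your files. A stdlib sketch of the two models (an illustrative assumption, not any vendor's real scheme; parameters are made up):

```python
import hashlib
import os


def e2e_key(password: str, salt: bytes) -> bytes:
    """End-to-end model: the key is derived from the password and exists
    only where the password is known. The provider stores ciphertext and
    the salt, but can neither decrypt the data nor offer a password reset
    that preserves it -- lose the password, lose the key."""
    return hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 200_000)


def provider_key() -> bytes:
    """Provider-managed model (Dropbox-style): the provider holds a random
    key, so it can decrypt for search/previews and reset your password
    without data loss -- at the cost of being able to read your files."""
    return os.urandom(32)
```

In the first model, a "password reset" can only re-grant account access; the old ciphertext stays undecryptable, which is exactly the availability cost described above.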

My problem with dropbox alternatives is that they are either far more expensive or don't run on linux (with syncing).

https://spideroak.com/one/ runs on Linux. I have not checked how it compares to Dropbox pricing-wise, but it ticks all my security/privacy boxes.

You can change your PIN. You can't change your face.

Which is why it's nice that none of the biometric auths can be used without also having a PIN for backup auth. And also why it's nice that you can disable biometrics entirely.

The comment was aimed at the Snowden example. If Snowden thought his PIN was compromised he could always change it, but once his face is compromised, what does he do?

This entire article was explicitly written to address this line of thinking, and IMO does so very well.

Not use Face ID?

That cuts both ways: only a fairly advanced attacker could "change their face" to access your phone. So Face ID still covers the 99% of cases Troy talks about. For the rest, I'm not sure a PIN works either, so they'd have to use a password.

That's not true at all.

So you can change your face, and you can't change your PIN?

Surprisingly true with most end users I deal with.

