I would advise against Google Authenticator as a backup as it really defeats the point of a hardware token.
Google Authenticator stores the TOTP secret in plaintext on your device, where it can potentially be stolen. An adversary who exploits your phone can generate TOTP tokens at will, regardless of the fact that you have a hardware token. If you are going to use Google Authenticator, it is your weakest link, and a security token buys you no added security, only ease of use.
The typical goal of a security token is to be able to assert: "No one can log into my account without this physical device or an offline backup token from my safe"
To achieve this, consider a device with built-in TOTP support in addition to U2F. All current Yubikeys fit the bill here, as well as some Nitrokey models. Desktop or Android users can use either USB or NFC devices, but it is worth noting that iOS lacks support for either, which means you would need a desktop or Android device to fetch TOTP tokens for an iPhone.
You can use the open source "Yubico Authenticator" apps to store your TOTP secrets in your key alongside your U2F secret. Now both methods use the same hardware, and your phone/computer is only handed OTP codes from the key when it is present, but can't generate them itself.
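To make the split concrete, here is a rough Python sketch of the idea (an illustration only, not Yubico's actual OATH applet protocol; the class and function names are made up). The token side holds the raw secret and answers a time-step challenge with nothing but the truncated code:

```python
import hashlib
import hmac
import struct

class Token:
    """Stand-in for the hardware key: the raw secret never leaves this object."""
    def __init__(self, secret: bytes):
        self._secret = secret  # lives in token memory only

    def respond(self, counter: int, digits: int = 6) -> str:
        # Standard HOTP/TOTP dynamic truncation (RFC 4226, section 5.3)
        mac = hmac.new(self._secret, struct.pack(">Q", counter), hashlib.sha1).digest()
        offset = mac[-1] & 0x0F
        code = struct.unpack(">I", mac[offset:offset + 4])[0] & 0x7FFFFFFF
        return str(code % 10 ** digits).zfill(digits)

def host_fetch_code(token: Token, now: float, step: int = 30) -> str:
    # The phone/laptop side only computes the time-step and displays the
    # result; it is never handed the secret itself.
    return token.respond(int(now // step))
```

The phone's role reduces to computing the 30-second counter and displaying six digits; compromising it yields only the codes it was shown, never the secret.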
An added bonus: you can now use security-token-backed login even on the majority of sites/browsers today that lack U2F support.
An extra bonus: these keys can be used for ssh without any server changes. Security token all the things :)
The attacker who gets access to the filesystem of your phone can almost certainly defeat any "encryption" a TOTP authenticator would use to protect secrets, so the premise of phone-based authenticators is that your phone isn't going to get compromised.
This is reasonable when you consider that if your computer --- the least secure device you own --- is compromised, your attacker is virtually certain to get your email account with it, because you'll have had (at some recent point) a logged-in session.
Meanwhile: the #1 concern that laypeople have about security tokens is that they'll be locked out of their account when they lose the token. Authenticator (or Duo) is a perfectly sane answer to that concern.
Finally, it's worth adding that at least the last time I helped someone set this up, you can't remove your phone number as a factor from your Google account until you have TOTP set up. Your phone number is an extremely insecure login factor.
We use, and recommend, Google Authenticator as a backup login factor.
We do not recommend Yubikey 4 keys for normal users. Nerds on HN might get a kick out of them; I say, go ahead and enjoy yourself. We're trying to solve problems for people who aren't computer experts.
That's a desktop TOTP application. Now not only do they have to have their computer with them to log into their Google account from their phone, but they have to have 2 security keys on the account to remove their phone number from it, and all their backups are physically separated from them, so unless they bring their backup codes with them when they travel, if they lose their key, they're boned.
And all this for what real additional security?
If you want to nerd out and get your security key to do pet tricks like handling your TOTP secrets, I do not have a problem with that. But please don't tell ordinary users they're wrong when they don't do that.
I have Yubico Authenticator installed on my phone and my desktops/laptops.
The android app is a direct fork of google authenticator and has nearly identical UX.
I tap/plug my key to either of them to get a token.
If you want to make the argument that TOTP via hardware token is overkill for most users, that is totally fair. On that note though, there is then no point in having a hardware token via U2F either.
Security is about the weakest links. All I am saying is that anyone going through the trouble of setting up U2F as this guide suggests might as well spend the extra 10 seconds to store their TOTP secret on the key as well, vs exposing it on the phone.
I assume someone that has a hardware token is getting it for a reason: To have assurances an attacker can't log in as them without that token.
You keep saying that, and it keeps being false. The point of U2F isn't to put your secrets in super-secure hardware so you can walk around feeling like you have an HSM hanging from your keychain like a bad-ass. The point of U2F is to defeat phishing attacks, which is how people actually get compromised.
It would not be much of a stretch to say that this U2F guide was written deliberately as a corrective to this mindset.
Your cost/benefit considerations prioritize relatively minuscule security improvements without considering usability costs or diminishing returns. While we're at it, why don't we just use one-time pads? After all, those are impervious to any form of cryptanalysis.
The risk profile for most users does not require a hardware-based auth factor if it results in real world usability sacrifices that end in either 1) accidental misuse or 2) gradual disuse.
You're optimizing for someone compromising the device, great. But the point is that if that risk is on the table, all of this work is essentially meaningless anyway.
"if that risk is on the table, all of this work is essentially meaningless anyway"
I can't agree with this strongly enough. If someone's willing and able to hack your iPhone, then you need more help than a random art major writing a yubikey howto can give you.
Just an offshoot thought, but some random art major writing one of these guides giving a bird's eye overview of the entire process is something most businesses should be able to do themselves. But they don't.
I am only focusing on this bit because it is a few extra seconds of work if you are going to have a hardware token anyway. Why expose your TOTP secret if you don't have to?
I like knowing that if the phones and laptops of someone on my team were compromised, we have some damage control.
With the approach Yubico Authenticator takes an attacker with remote access to your Android Phone and a keylogger on your laptop still can't log in as you remotely.
Granted with more effort that combo can burn you in other ways, but TOTP on a hardware token still gives you some very real reduction in attack surface with no real added user burden. Why not?
Three people have answered that for you, and you keep ignoring them. We're protecting Google accounts, not TOTP secrets. We all understand that you can keep a TOTP secret more secure than a Google account. But since nobody cares about the security of a TOTP secret for a compromised Google account, I'm not sure why we're still talking about this.
Yes, but you didn't address the main rebuttal, which is that encrypting or not encrypting the TOTP store is a red herring. If someone has access to the filesystem they can likely work around the issue of encryption, just as they would on a desktop computer. And if they control execution, encrypting the data becomes utterly moot (and I'd argue most cases of someone gaining filesystem access where the individual cares enough to have 2fa are going to be a jailbroken device, which results in complete debugging and reversing capability, which makes this redundant).
This is not the thing to optimize for. Yes, optimize for ease of use, because take PGP as an example of great security vs horrid usability and look where that's got us. If you do the security improvement analysis from a cost/benefit perspective, you do not win by using a hardware key over regular 2fa apps. Users will shoot themselves in the foot, or simply not use it.
I guess I didn't understand that bit. It sounds like there is some misunderstanding about how Yubico Authenticator works?
It does not store the secrets on the disk/memory of the phone/laptop at all. It just sends over a code. The device only sees one code and nothing else.
This is also why a user can get a new phone and just tap the key to the new phone and truck on. Magic.
When a user drops their Google Authenticator phone in the toilet however... bad day.
No, I'm pretty clear about how Yubikey TOTP works. The point is that the threat model doesn't make sense. Any device you can use Yubikey TOTP on is significantly less secure than your iPhone.
Yes, I'm clear that the attacker in this scenario doesn't get your TOTP secrets. If your primary goal is to protect your TOTP secret, I see your point. My problem is, my goal is to protect my actual account. I kind of don't give a shit about my TOTP secret, because Google will give me as many new TOTP secrets as I ask it for, but I only have the one account. If the device I'm securely generating a TOTP secret for is compromised, I'm going to feel pretty silly doing a security theater dance with my Yubikey as my attacker steals my cookie and locks me out of my Google account.
Don't be evasive. It doesn't matter how you secure the channel, with cookies or magic beans: if the attacker controls the device you're using Google Mail from, they've compromised the account. What's the point of having a super secure key to a house with a wide-open hole in the side of it? There is no point, is the answer.
They may have compromised gmail, because gmail is logged in. They however won't get to the other 20 services that I have not recently logged into on the device that also use hardware tokens. They don't get the TOTP secret for my AWS account, which I only log into from that device in emergencies. Etc.
If someone gets remote access to your device it is a very bad day, but you -can- have damage control and a clear picture of what they had access to and what they did not.
If the attacker roots your phone and it has your unlocked password manager on it and google authenticator with all the 2fa secrets... well now they get the entire farm, including for services you don't have active cookies for.
Hardware tokens are not magic, but they are a very useful tool and if we combine enough tools we make the life of an attacker that much harder.
And then there is the recent story of someone getting private emails/password resets/paypal info sent because Gmail ignores the dot in their email address... https://news.ycombinator.com/item?id=14140569 - which makes one wonder if that's already an attack angle being used.
Let me simplify my point: Whether or not the TOTP secret is on the smartphone, encrypted or not, or sent to it from another device, is the wrong attack vector to optimize for.
Getting mainstream users en masse to consistently and correctly use any 2fa is a win.
Furthermore, you're moving the goalposts a bit by using the Yubikey in this scenario. So sure, if someone compromises your phone they don't compromise the Yubikey, but 1) how certain are you that your Yubikey is safer than a modern iPhone or Android model with the crypto and security engineering that entails and 2) how certain are you that accessing your iPhone's filesystem or execution state does not bypass this whole dance entirely?
For you, the minimal security gains might outweigh the usability costs if you know what you're doing. But a hardware token for most people, as the technology currently stands?
The device keeps the secret in self-contained memory and never exposes it. It is on the other side of a USB bus or in some cases NFC. There have been local attacks against old designs via side channel attacks etc, but never once a remote attack. The model makes that pretty hard. The phone is essentially zero knowledge. Android and iOS allow arbitrary execution of user installed code, are a massive attack surface, and have piles of published vulnerabilities.
By moving secrets to very simple easy to reason about devices we get substantial reduction in attack surface.
Also I have helped deploy these to several dozen people, taught workshops etc. It is no harder than teaching people to use Google Authenticator, but lower attack surface.
Use U2F where you can, and when you must fall back to TOTP, at least you can promise an attacker does not get a free pass to generate codes whenever they want, which is something.
Getting mainstream users to use 2FA? Is that like getting them not to use overly simple PWs? Or not use open wifi? Those people? Not to sound like an ahole, but... how's that working out for ya?
I think ya might be able to argue that if you're going to add friction (e.g., Yubikey) you're also creating a great sense of seriousness. That sense of seriousness is seriously lacking.
All that said, the UN + PW idea is too weak. We need something that's up to the threat AND is also appropriate to the risk of loss. Best I can tell, as a general mainstream rule, we're not even close to that. It's 2017? Really?
"The attacker who gets access to the filesystem of your phone can almost certainly defeat any 'encryption' a TOTP authenticator would use to protect secrets"
Is Android's hardware-backed keystore no good? The documentation makes it sound like keys can't be extracted or used without user authentication.
Do you have a citation for the fact that google authenticator stores the keys in plain text? Furthermore, for the case of google authenticator on an iPhone, any files on the user partition are encrypted anyway, and I know from experience that the google authenticator app does not back up keys to either iCloud or iTunes backup. This should mean you are safe on iOS
If you do a backup (even if not rooted, you can use ADB to back up your apps and data), then you can simply restore the app and data to your new phone and the codes for the device come with it.
If your Android is rooted, you can use Titanium Backup, which we’ve written about before, to take a backup of your Google Authenticator app data. [...] If you have root access to your device, you can actually extract the credentials manually
Well, the scenario we want to defend against is an attacker that can remotely (or even locally) exploit/root the phone (see the long list of vulns for iOS and Android that have allowed exactly this). How many of these still exist, not yet patched?
Depending on who you work for, someone might just burn a 0-day on you. It all depends on your threat profile.
Putting the secret in a hardware token gives you easy to reason about assurances a mobile phone OS vendor can't ever offer.
Also this means when you get a new phone, you just install app and tap key. No setup required.
No, that's a scenario you want to defend against, and I'll remind you again that if you're dealing with attackers that can exploit your computing devices directly, the tokens are pretty much cosmetic. If you have an insecure phone and you actually use it like a smartphone, you're boned no matter how many security tokens you've got attached to your key ring.
When we work with lawyers, reporters, and NGOs, what we find are people with much more urgent security problems. They're one carefully worded email away from giving their entire email account away to a 25 year old in Estonia. They aren't worried that their phone is about to get owned up --- mostly because that isn't going to happen, but for other reasons too.
Real targets are going to be compromised for 3 reasons:
1. They're going to be phished out of losing their credentials.
2. They're going to share credentials between sites and lose them in a breach of one of those sites.
3. They're going to click on an attachment and lose their whole computer to an attacker.
The U2F/TOTP stack this post recommends nicely addresses (1) and (2), and nothing anyone on this thread is talking about addresses (3). I'm not sure why we're spending so much time considering (3).
I think the other part of this is that for these people (and probably most people) losing access to your gmail account is a catastrophic event.
That access can be lost because of an attack or by losing the keys. The former is actually much less likely than the latter so mitigating in favor of it instead doesn't make sense in this threat model.
Any device that generates TOTP tokens needs the secret key available by design. You could read the source code or spec sheets, but an easy way to prove this is by backing up Google Authenticator via Titanium Backup and restoring it to a new device. Now both devices generate the same codes.
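A sketch of why that is unavoidable: the code is just an HMAC over the current 30-second counter, keyed with the shared secret, so anything that can display codes necessarily holds key material an attacker can copy. This is a minimal RFC 6238 implementation; the secret below is the RFC's published test value, not anyone's real key:

```python
import base64
import hashlib
import hmac
import struct

def totp(secret_b32: str, unix_time: int, step: int = 30, digits: int = 6) -> str:
    """RFC 6238 TOTP. Whatever runs this must hold the raw secret."""
    key = base64.b32decode(secret_b32.upper())
    counter = struct.pack(">Q", unix_time // step)
    mac = hmac.new(key, counter, hashlib.sha1).digest()
    offset = mac[-1] & 0x0F
    code = struct.unpack(">I", mac[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

# Two "devices" holding a copy of the same secret always agree, which is
# exactly why a restored backup keeps generating valid codes.
secret = "GEZDGNBVGY3TQOJQGEZDGNBVGY3TQOJQ"  # RFC 6238 test secret, base32
assert totp(secret, 59) == totp(secret, 59) == "287082"
```

There is no server round-trip and no per-use authorization anywhere in that flow; copy the base32 string once and you can mint valid codes forever.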
There have been plenty of iOS exploits as well as Android and everything else. Phones have a lot of attack surface and are not a reasonable place to store 2FA of any kind, IMO.
The separate hardware TOTP device never exposes the private key to system memory or disk at all. Even if your phone was rooted by a remote attacker, they could not generate tokens.
Likewise even if someone physically stole your unlocked phone and your pin-protected key... you are still in pretty good shape.
With a hardware TOTP device, the phone is just a "viewer" for one code at a time, as generated by the token.
For virtually all users, their iPhone is in fact the most secure computing device they own. It's meaningfully more secure than a computer running a desktop operating system. If we're talking about protecting applications running on a desktop OS, the idea of keeping things off the phone because "phones have exploits" is pretty silly; in that threat model, the desktop is also owned up, and with it the email account --- it's now secured solely by a cookie in your Chrome cookie store on your compromised desktop!
Yes, the desktop is just as bad as the phone. This is why I don't allow any of those devices to hold secrets that could be used, without me being physically present, to access data of users I am responsible for, or my own.
You are coming at this from a threat profile of joe individual user. Okay, point taken.
I am talking about the perspective of trying to take every reasonable step to remain secure while being targeted by skilled adversaries who have a lot to gain if they succeed. It is not that much extra work to reduce attack surface so much further than TOTP-generator-on-a-phone offers, so why not teach best practices to anyone who will listen?
Say I have 30+ TOTP secrets in my mobile phone app, and also my password manager. Everything from Gmail to my AWS root account.
If the TOTP secrets are on my phone and an attacker compromises my phone... they get -everything-. If I am using a hardware token for TOTP and I quickly expire the sessions of all really important things I don't log into often, like AWS... then an attacker only gets a slice of the farm instead of the whole thing.
What this buys us is damage control and a much clearer picture of what an attacker could have accessed, and what they probably could not have, because no cookies or secrets were available in memory or on disk at that time. I can assert -maybe- this one token was phished, but that none of the others were at risk.
If the attacker is on a phone with Google Authenticator, they can just generate all the codes they want for every service. We lose the whole farm.
You keep doing this. I didn't say "desktops are as bad as phones". I said "phones are far better than desktops".
I promise you, my security requirements are as stringent as yours are. My 2FA stack is Hardware U2F, Software TOTP, and physically secured backup codes. That's what I recommend. You keep suggesting that this stack is inferior to yours, and I keep explaining why it isn't and why the threat model suggesting to you that it is is incoherent.
Say we each have TOTP for say 20 accounts and our password managers on our phones with the credentials for them as well. We are system administrators with access to piles of PII. Account password resets require 2FA so email alone is not enough to spider to other accounts.
Both our phones have been rooted and are accessible by a remote attacker because some "coworker" sent us a new beta app that was in fact malware.
In both cases the attacker has all our passwords to all apps via our password managers. That is lost.
We each are logged into 5 of these services and the attacker steals the cookies. Those are lost.
Now what about the remaining 15 services we are not logged into? Things we don't log into super often, but some of which are quite important, like AWS root credentials.
In your case, the attacker goes and opens the Google Authenticator sqlite database and gets every TOTP secret you have in plain text. You just lost all 15 remaining accounts.
In my case those secrets exist on a hardware token and can't be accessed at all. If I catch my intruder at this point I can be reasonably sure those remaining accounts were not impacted.
Hopefully this clarifies the wider model I am working from.
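For the skeptical, the "opens the Google Authenticator sqlite database" step really is about this simple. A hedged sketch: the path and table layout below are what older Android builds of the app reportedly used, and the exact schema varies by version, but the shape of the attack is the same:

```python
import sqlite3

# Reported location in older Google Authenticator builds (an assumption;
# varies by version). An attacker with root just copies this file off the phone.
DB_PATH = "/data/data/com.google.android.apps.authenticator2/databases/databases"

def dump_secrets(db_path: str = DB_PATH):
    """Return (account, base32-secret) pairs. Note: there is no decryption step."""
    with sqlite3.connect(db_path) as db:
        return db.execute("SELECT email, secret FROM accounts").fetchall()
```

Every row that comes back is a standing skeleton key for that account until the user notices and re-enrolls.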
If only there were a similar guide to getting gpg agent working with the yubikey-stored gpg keys and ssh. I've done it, but for the life of me I couldn't tell you how, as it was mostly just trying magic incantations of things until it started working.
I also will be adding an alternate "quick ssh setup" guide via PKCS#11 flows to just store an existing ssh private key. Still I think GPG is the way to go in general given all the other use cases it opens up.
Please file issues with anything you want to see! I have a lot of unpublished content I can get polished/up if people care.
"You should disable any other keys that aren't backed by a security token" ... why? You don't need a security token to physically secure a backup key; just put it on a USB drive and stick it in a safe (or a sock drawer).
Security tokens are a nice little bonus for security, and they're a major corrective for the kinds of real-world attacks that screw real people over, like phishing (and dumb passwords). But they're pretty marginal against the kinds of attackers who will target SSH keys. Don't get fetishistic about them; at bottom, for serious systems security, they're mostly cosmetic. It feels good to say that all your SSH keys are held in secure devices, but it doesn't mean much.
It means a lot because I have to physically touch the device every single time I ssh.
I can even do agent forwarding taboos and know an attacker can't go creating new connections on that agent without a physical action from me each and every time.
Compare this to how ssh keys are normally used. You use it once, type in a keyloggable passphrase, and the key is unpacked plaintext into system memory for, in most cases, the rest of the time the system is booted.
You could invalidate the passphrase after every connection but this puts an unreasonable amount of work on the developer.
Simply tapping once for each connection and having no way for an attacker to avoid that is a great middle ground, imo. Particularly for high level production keys.
You're making a comparison to an example I didn't cite. I'm not saying that you should have software-only SSH keys on your computer alongside your Y4 SSH pubkey.
I was mostly responding to: "But they're pretty marginal against the kinds of attackers who will target SSH keys."
I am arguing it is not all that marginal. If the only non 0-day way into production is via ssh to a bastion host with a touch-based hardware token then their lives are more than marginally harder than an on-disk key.
It's not as good as having a hardware key, but they still need your username and password and pwn your phone. That's a lot of trouble. So for most people, TOTP software is good enough security.
I agree mobile app 2FA it is probably good enough for most people. This article however is about using a hardware token for login.
If you have a need for hardware tokens, use them end to end. Using a hardware token and having a less secure backup method means you are only as secure as that less secure backup method.
That simply isn't true, because the hardware token defends against phishing attacks --- in fact, that is the entire reason why U2F tokens exist in the first place. It's literally the motivating use case for the standard: experts with code generators were still getting phished.
So, when you have the token handy, you use it, and you're not exposed to phishing. When you don't, you use the mobile app, and you're exposed to phishing (but not to weak passwords and breaches in sites). It's not complicated, unless you think the token does more than it really does for your overall security.
I will grant you the phishing use case, and that one is relevant to average users. I admit I mostly work with infra folks that would not easily be phished, but might have one of their devices compromised unknowingly rendering phishing moot.
TOTP is a mess in regard to phishing but if we have tools to avoid some of the problems while we are stuck with it, I feel they are worth mentioning.
Particularly for people savvy enough to purchase hardware tokens for personal use.
Authy has the same core issues because the problem is they have to store the secret key in plain text somewhere for TOTP to work. Worse: it is closed source and does not allow itself to be easily audited so we don't even get to know for sure where the key is stored and how beyond what the docs promise.
Security is hard enough when everything is open source. Closing foundational security tools so only a select few biased individuals can deem them secure on a deadline is never a good plan.
Yes. But those concerns don't really matter. Just use whatever TOTP application you're most comfortable with, and, because even experts can be phished, try to use the security key as much as you can.
Yubico Authenticator and cryptostick.oauth are the only open solutions I am aware of that one can easily verify don't ever expose your secret key.
Both of course assume you have the secret key on a Yubikey, Nitrokey or similar.
Any app-only TOTP solution has to expose your private key somewhere by design and thus are best avoided in favor of hardware-backed solutions when possible.