We need WebAuthn yesterday.
Also where is the password to your Authy account?
And where are your 2FA backup codes?
In essence, someone has to get both my 1Password password and my 1Password secret key, and then either compromise my phone (for Authy), my phone number (to recover 2FA backup codes via Dropbox SMS recovery), or my computer (for direct Dropbox access). Very few organisations have that amount of capability, and I have nothing stored in my accounts that is worth expending it on. If I did, I would store it behind GPG with a password that exists only in my mind.
Also, to lose access I'd need to lose my 1Password secret key, or forget my Authy password and get logged out of all my Dropbox devices simultaneously. The chances of that are rather slim.
$ gpg -d encrypted-secret.txt | goathgen
Having your TOTP secrets on a unique device means that an attacker in that scenario (access to your endpoint) could steal a single TOTP code for the single site, but wouldn’t be able to steal the seed secret itself.
TOTP as a "something you have" approach to 2FA is entirely dependent on how well the device secures the secrets.
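And the whole factor reduces to a shared seed: anyone who can read the seed can mint valid codes indefinitely. A minimal RFC 6238 sketch in Python (standard library only) makes the point:

```python
import base64
import hashlib
import hmac
import struct
import time


def totp(secret_b32, t=None, digits=6, step=30):
    """Compute an RFC 6238 TOTP code from a base32-encoded seed."""
    key = base64.b32decode(secret_b32.upper())
    # Counter is the number of 30-second steps since the Unix epoch.
    counter = int((time.time() if t is None else t) // step)
    mac = hmac.new(key, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = mac[-1] & 0x0F  # dynamic truncation, per RFC 4226
    code = (struct.unpack(">I", mac[offset:offset + 4])[0] & 0x7FFFFFFF) % 10 ** digits
    return str(code).zfill(digits)
```

With the RFC 6238 test-vector seed (ASCII "12345678901234567890", base32 `GEZDGNBVGY3TQOJQGEZDGNBVGY3TQOJQ`) and t=59, this yields the published 8-digit code 94287082. There is no secret sauce beyond the seed itself, which is why where the seed lives is the entire question.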
The private key inside an RSA hardware token is (nearly?) impossible to retrieve. Google Authenticator's TOTP keys are a bit easier. A file on a laptop is easier still.
Not sure about any apps that take advantage of that yet, but the hardware seems to be there.
If you use krypt.co, you can store ssh and GPG keys on your phone's TPM, as well as a secret key for use with a browser addon to facilitate WebAuthn. So, you can already use your phone as the "thing you have".
All iOS apps essentially do, if they store things in the keychain or even the filesystem.
Apart from that, I believe TOTP keys should be encrypted at rest, and that is actually my main issue with the described tool: it stores the keys in plaintext, in a config file.
That doesn't change the necessity of protecting TOTP keys, session keys, bearer tokens, etc., it's just that your second factor is supposed to be a parallel factor, not an extra lock around your password.
(I have a few things I intend to be survivable across a total laptop compromise, but they're special-case things like credentials that can upload code that will be run by a few thousand people. They're not protected by regular website 2FA. For regular websites, a browser compromise would almost always let you wait until I'm logged in, then disable 2FA and change both the email address and password on the account, at which point it's irrecoverable.)
You could start to approach that on a laptop — make sure you have FDE enabled, use the operating system's sandboxing features pervasively, store secrets using the TPM, etc. but that's a huge amount of work and the attack surface for apps on your laptop is enormous, especially for developers: how many people using a system like the one described are one unlucky npm install away from sending their TOTP seed to an attacker? The equivalent attack requires a system compromise on a phone (which tend to have 7+ figure USD bounties on iOS).
The rate of CVEs on Android, combined with the sheer number of manufacturers who are slow to ship updates or never deliver any, means that unpatched devices are nothing like the rarity this statement suggests.
> feel free to read that as “buy an iPhone 3GS or later..."
iPhone 3GS stopped getting patches 5 years ago (https://en.wikipedia.org/wiki/IOS_6). I think it qualifies as really old and unpatched.
Sometimes convenience definitely comes at a cost. Getting to the point of having an application 'remember' your passwords (granted, the passwords can then be completely unique and more secure), and then duplicating the OTP keys (which are meant to be kept on an independent device) into the same application, seems like a step too far... at least for me! Think I'd prefer the inconvenience :/ No offense to anyone else's comments intended!
The scenarios I feel like most corporations want to protect against are:
1. Accidental password disclosure, e.g., c/p password to someone who should not know it.
2. Laptop compromised, e.g., with a trojan, keylogger, etc.
3. Laptop physically compromised, e.g., stolen, or opportunistically available (e.g., locked, but sitting on a desk).
It's #2 that's the problem here, and the reason why you want your OTP on a separate device, or hardware designed for it like a Yubikey. In the case of #2, if your OTP secret is on your laptop, and protected by only your laptop password, someone with remote access only needs to log your activity until they have your password, and then they have both your password and OTP.
Were your OTP instead on a separate device, they would also need to compromise that device; with something like a hardware key, this is by design exceedingly difficult, since it's a separate device built to never give up the secret, whereas a filesystem can't really tell the difference. That's the point: it should require compromising both factors, separately.
But if you're just keeping your OTP secret on disk, and ignoring #2? You're really only covering #1, and a good password manager will make accidental disclosures rather difficult to do anyways, so it seems like little benefit.
The cool thing is that the key is derived from the Trezor's seed, so you can have several backup devices. And you need PIN + passphrase, not just the Yubikey.
Even works with Browserpass and the Pass Android client:
The magic incantation to get the latter set up is something like `ykman openpgp touch aut on` (though I can't recall if it's `aut` or `sig` or `enc`).
2FA means two factor authentication. It's a guarantee that even if someone hacks your password or steals your computer, if they don't also have physical access to the secondary device (phone, yubikey etc), they won't be able to gain access to your account.
If you put the secrets of 2FA into the same computer, you are back to 1FA.
However, you can end up less secure than if you hadn't enabled 2FA in the first place. Once you enable 2FA, the service may disable the regular checks it applies to 1FA accounts, and it can be incredibly difficult to regain access to the account if you lose your 2FA devices.
Think about this scenario. You use Google authenticator on your phone, and have your banking app on your phone. A thief knows your banking password and steals your phone.
Now replace "phone" with "computer". I don't see how changing the underlying 2FA device changes security.
The only real danger is if that 2FA db on your computer is not encrypted. But again, the same danger exists if you use an unencrypted phone.
I don’t know where you’ve worked, but almost every time I’ve been in such a discussion, that has been an explicit goal. The most common situation is someone losing a laptop but not a token or phone.
Even in the case where someone gets the phone, note that phones have fairly strong protection against reading private data directly, and 3/5 of my auth apps, plus my password manager, require both the device and a passcode or Touch ID to open, so it’s non-trivial to get either codes or passwords out of the device. On iOS, at least, it’s been many years since “an unencrypted phone” existed, so there are no simple ways around this that don’t devolve to some form of “the CIA/Mossad/etc. target you but inexplicably choose not to hold a gun to your head until you unlock the account”.
Especially since most enterprises own their employees' laptops, but not their phones. The administrator can manage and mandate full disk encryption on the PC. But if the employer offers TOTP as a second factor, they have no control over what device holds those TOTP secrets.
Who does that? And why would anyone do that?
Maybe it's out of necessity, because convenience features would be less secure. Maybe there's not much overlap between those concerned with security and those concerned with ease-of-use, except for cases where companies can develop tools that encourage good security practices while exposing the end user to other risks (e.g. password vaults with one entrypoint).
So what you end up with is people gravitating towards bad security practices and using shortcuts because it's a PITA to maintain the good ones. Let's skip using SSL because we need to get work done instead of troubleshooting the dozen possible misconfigurations. I'll use a really short password for my AppleTV account because it's too hard to type the damn thing with the on-screen remote. I'm tired of getting a 2FA code 50 times a day to get work done, so I'll implement a hot key to generate it from the command line.
Seems like ease-of-use for security solutions should be almost as much of a priority as the security implementation itself in some cases.
npm publish . --otp $(bw get totp npm)
export BW_SESSION=$(bw unlock --raw) && echo $BW_SESSION
Edit: Suppose I should add the link. https://gitlab.com/elagost/linux-2fa
I built it for the ClockworkPi device, so there were more constraints than on a normal desktop system. https://www.clockworkpi.com/
I'm really not worried about my computer being stolen or pwned. I'm more worried about somebody trying to access some service from elsewhere in the world, with a password they obtained by some other means.
In which case, OTP on my laptop is sufficient, and it also has the nice property of being a backup of the seeds on my phone, which I can lose.
Yubico provides a Python and C library for this: https://developers.yubico.com/Software_Projects/FIDO_U2F/U2F...
Without being purely "conservative", i.e. it was first done on phones therefore it should stay on phones, what are the actual arguments for one vs the other? What do phones have that laptops don't, and vice versa?
This solution stores the secret in your home directory, which is easily readable by any process that has access to your hard drive. Anyone with brief access to your laptop/workstation can easily copy the secret. If you back up your home directory, anybody with access to your backups has access to the secret.
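A filesystem at least lets a tool approximate what ssh does for private keys: refuse a seed file whose permissions allow group or other access. A hypothetical sketch of such a check (the path would be whatever the tool uses; this is not its actual code):

```python
import stat
from pathlib import Path


def seed_file_is_private(path: Path) -> bool:
    """True only if the seed file is accessible by its owner alone (e.g. 0600)."""
    mode = path.stat().st_mode
    # Reject any group or other permission bits.
    return not (mode & (stat.S_IRWXG | stat.S_IRWXO))
```

Of course this only mitigates casual exposure: any process running as your user can still read a 0600 file, which is the core objection above.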
Again, forgive my ignorance, but do smartphones routinely include TPM chips that Google Authenticator utilizes? Never heard of that before.
Though it could and probably should. (Really, this 2FA CLI app could probably do so as well, most modern laptops probably have a TPM.)
People are away from their laptops, and exposed to the risk of them being stolen without immediately knowing it, more often than is true of smartphones, which makes laptops somewhat more of a security risk.
Though, I thought that now TOTP in general, on any platform, is at best a second choice after U2F hardware tokens.
# expects keyboard shortcut to be super+2
# types the current TOTP code into the focused window
twof=$(oathtool --totp mysecret)  # "mysecret" is the hex-encoded TOTP seed
xset r off                        # disable key auto-repeat while synthesizing keys
# release the "2" from the shortcut, then type the code
xdotool keyup --window 0 2 type --clearmodifiers --window 0 "$twof"
xset r on                         # re-enable auto-repeat
sleep 0.3 && xdotool key 'Super' &  # tap Super so the modifier isn't left stuck
I'd say that if your main concern is having to type digits on your computer keyboard and you'd rather copy/paste, KDE Connect can put a TOTP code generated on your phone into your computer's clipboard, as in: 1- generate the OTP on your phone, 2- C-v on the computer, 3- done.
$ time otp aws
otp aws 0.00s user 0.01s system 98% cpu 0.015 total
Even back in the '90s, MySQL was able to hide the CLI-supplied password from the process list. It would be cool if oathtool could do the same.
This has always been a brittle and not easily portable approach.
And doesn't protect you from an attacker doing something much simpler: reading your .bash_history file.
Passing passwords as arguments has always been a bad idea.
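One mitigation, sketched here as a hypothetical pattern rather than anything oathtool actually does: have the tool read the secret itself, from a hidden prompt or from stdin, so it never appears in the process list or shell history.

```python
import sys
import getpass


def read_seed() -> str:
    """Read a secret without it ever appearing in `ps` output or .bash_history."""
    if sys.stdin.isatty():
        # Interactive use: prompt without echoing the secret.
        return getpass.getpass("TOTP seed: ")
    # Piped use, e.g.: gpg -d seed.gpg | ./tool
    return sys.stdin.readline().strip()
```

This composes nicely with the gpg pipeline mentioned earlier: the decrypted seed travels over a pipe between two processes and is never a command-line argument.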
(Debian maintainer and upstream maintainer are the same person.)