TOTP seeds are only useful if you have the account they are associated with. How would the 2fa app by some random guy discern the identity associated with the seed?
Isn't one issue the display of the codes on the lock screen? If showing notification contents there is enabled, it would be problematic for a code to pop up while you were away from your device saying "Your Google 2FA code is 100000 right now". I get that the iOS default requires unlock to view the actual content of the notification, but still, that seems less than perfect from a security standpoint.
Notifications say nothing about the account - just that you got this cool number on the app. And yeah you need the device unlocked to read the notification anyway
But yeah security concerns with it are overblown, the more realistic concern is trusting me personally vs a name brand!
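For context on why the seed alone says nothing about the account: a TOTP code is computed purely from the shared secret and the current time (RFC 4226 / RFC 6238), with no account metadata involved. A minimal Python sketch (the secret value below is just the RFC test key, not a real seed):

```python
import hashlib
import hmac
import struct
import time

def hotp(secret: bytes, counter: int, digits: int = 6) -> str:
    """One HOTP code (RFC 4226): HMAC-SHA1 over a big-endian counter."""
    mac = hmac.new(secret, struct.pack(">Q", counter), hashlib.sha1).digest()
    # Dynamic truncation: the low 4 bits of the last byte pick an offset.
    offset = mac[-1] & 0x0F
    code = struct.unpack(">I", mac[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

def totp(secret: bytes, period: int = 30, digits: int = 6) -> str:
    """TOTP (RFC 6238) is just HOTP with the counter = current 30s window."""
    return hotp(secret, int(time.time()) // period, digits)

# RFC 4226 test vector key; counter 0 yields "755224".
print(hotp(b"12345678901234567890", 0))
```

Note that nothing in the computation identifies the account — that's exactly why the apps need the metadata from the QR code.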
To add an item, you scan a QR code. That QR code usually contains the name of the service and your username. For example, the format of GitHub's QR code is:
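This presumably refers to the standard otpauth key-URI scheme that most services encode in their 2FA QR codes. An illustrative (not verbatim) example of the shape a GitHub-style QR code takes, with made-up values:

```
otpauth://totp/GitHub:username?secret=JBSWY3DPEHPK3PXP&issuer=GitHub
```

The label (`GitHub:username`) and the `issuer` parameter carry the account metadata; only the `secret` parameter is the actual seed.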
What? The same way that when you look at the 2FA codes in your app you know which account they are associated with. The QR codes people typically scan for that don't just contain the seed itself but also metadata for the associated account, like an email address or account name.
I had a similar concern when I used a random TOTP app on my Garmin watch, but was relieved when I considered the point I raised. To add codes to the watch you have to paste a seed into an app; it couldn't be triggered from a QR code. I didn't realize the QR code you scan for 2FA carried so much info.
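As a sketch of how an app can recover the account metadata from such a scan, assuming the standard otpauth key-URI layout (the URI and its values below are made up for illustration):

```python
from urllib.parse import parse_qs, unquote, urlparse

def parse_otpauth(uri: str) -> dict:
    """Pull issuer, account name, and seed out of an otpauth:// URI."""
    parsed = urlparse(uri)  # scheme=otpauth, netloc=totp, path=/<label>
    params = {k: v[0] for k, v in parse_qs(parsed.query).items()}
    label = unquote(parsed.path.lstrip("/"))
    # The label is conventionally "Issuer:account"; the issuer may also
    # (or instead) appear as an explicit query parameter.
    issuer, _, account = label.rpartition(":")
    return {
        "issuer": params.get("issuer") or issuer or None,
        "account": account,
        "secret": params.get("secret"),
    }

info = parse_otpauth(
    "otpauth://totp/Example:alice%40example.com"
    "?secret=JBSWY3DPEHPK3PXP&issuer=Example"
)
print(info["issuer"], info["account"])
```

So the app knows exactly whose code it is displaying — everything except the secret is plain-text metadata.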
I hash them using werkzeug's built-in utility. "Encrypt passwords" sounded more readable to someone who isn't technical and may not know what hashing is. But you're correct - I should update that. Hashing isn't encryption. Thanks for the feedback.
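For readers unfamiliar with the distinction: a password hash is salted and one-way, so it can be verified but never reversed, whereas encryption is reversible by design. A stdlib sketch of the same idea that werkzeug's `generate_password_hash`/`check_password_hash` implement (not werkzeug's exact algorithm or storage format):

```python
import hashlib
import hmac
import os

def hash_password(password: str) -> str:
    """Salted PBKDF2-SHA256; stores salt alongside the digest."""
    salt = os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 600_000)
    return salt.hex() + "$" + digest.hex()

def verify_password(password: str, stored: str) -> bool:
    """Re-derive with the stored salt and compare in constant time."""
    salt_hex, digest_hex = stored.split("$")
    digest = hashlib.pbkdf2_hmac(
        "sha256", password.encode(), bytes.fromhex(salt_hex), 600_000
    )
    return hmac.compare_digest(digest.hex(), digest_hex)

stored = hash_password("hunter2")
print(verify_password("hunter2", stored))  # True
```

Because each hash gets a fresh random salt, hashing the same password twice produces different stored values - another way it differs from deterministic encryption.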
If it was solely up to the opposing party, then they'd strike down every single expert. The judge still is the ultimate decider on what evidence (including expert testimony) is admissible.
It's the jury that ultimately decides whether the evidence is convincing, even if the judge allows it. The defense will try to cast doubt on the evidence when they make their arguments.
This also reminds me of one of Norm Macdonald's bits. He says (very seriously) that if he were on a jury he would not convict someone on the basis of DNA evidence. "What do I know about DNA?" he says.
> Whatsapp/Signal have built their platforms so they can't read user private messages, so they have no ethical obligations to stop bad things happening on their platform
Why draw the line there? Why don't those platforms have an ethical obligation to build the features that would allow them to stop bad things happening on their platform? Especially if they knowingly developed the current implementation specifically to avoid ethical culpability?
That's a political statement. The position is that one should have the right and ability to communicate with other people in a secure fashion. If you deny this right, and build structures to monitor all communication, then when "bad" people take over the government, you end up in a dystopia. Building any sort of anti-government political movement then becomes very difficult, because they can hear whatever you say.
So I am glad that software like Signal/Whatsapp exists that allows secure communication [1]. And I would rather take the harms caused by them being unmonitored than the harm of future dystopian governments. Due to how crypto works, I don't think there is much middle ground here.
[1] I would prefer open source, more community owned platforms take over than these two.
Sorry but you were arguing that Discord should snoop even more into what we're doing for ethical reasons, and now you're saying privacy is a virtue. Do you have a reason to think Discord doesn't already do these things and just doesn't get it right every time?
Phone and email aren't more private than Discord either. Arguably less. Difficult to get a phone these days without buying it on camera. And a phone company will give up all your messages.
There are two competing principles here: (1) Privacy for individuals (from government and non-government entities), and (2) generally do things in a way that minimize crime. Both are good, and generally I want communication platforms to conform to (1) rather than (2).
Whatsapp/Signal are as close to (1) as possible by design and can't actually do (2) at all. Phone/mail are somewhere in the middle, but quite close to (1). In most countries there is no mass recording of phone calls or mail, despite it being trivially easy to do. Moreover, due to long historical legal precedent, I don't think phone/mail companies have much freedom to do things differently. They are pretty much constrained to do exactly what the government tells them to do.
Discord on the other hand does not respect (1) at all. In fact, it very intentionally records and reads everything for profit. And it hands over any info requested by law enforcement. So, ethically, they should either rewrite Discord to respect (1) or do (2). I don't think they do either. As others have noted in this thread, it is trivial to find servers that are clearly criminal.
One might argue that it is impossible to do so because there are so many servers. My second political position is that if your public platform is so large that you can't effectively moderate it, that is not an excuse. You are culpable. Simply stop your platform from growing past the point where you can't effectively stop bad things happening. You don't have the right to profit while enabling bad things.