Creepiest thing about seeing this ticket (again?) is noticing that the first comment is about how it used to be illegal to write anything with cryptographic security in the US and sell/give it to the outside world.
It works the other way too. A surprising number of countries still restrict the import of cryptographic technology, including several EU states.
The whole thing seemed very silly and theatre-y to me.
I have war stories for days about all the times companies I worked for had to have customers pull DLLs from 3rd-party sites in order to comply with completely political mandates.
Sarcasm aside, the only way to make sure people get encryption is to make it impossible to restrict the technology. That's how encryption ended up spreading. You don't put disruptive tech on every computer on the planet by waiting for permission.
Unfortunately SCOTUS has continually widened the scope of 2A over the years.
> as a non-American who's flooded with videos of random Americans walking around supermarkets carrying semiautomatic rifles
You need to broaden your news sources; this is by no means common, except for perhaps in a few gun-happy states like Texas.
> You need to broaden your news sources
To be fully honest, I don't think I "need" to anything. My life doesn't revolve around having an accurate impression of the US. The image your country spreads of itself is one of loud angry lefties on the coasts and gun wielding red necks in-between. I'm sure that reality is very different, but don't blame me for the bad country marketing :-)
Geez, the stereotypes people spread!
But yeah, there are vanishingly few places in the US where you should expect to be in the presence of gun-toting civilians while doing something as mundane as grocery shopping.
By the way, the archaic meaning of "regulated" means "properly disciplined and drilled". It did not refer to control or supervision by a state.
That still leaves open the question of what levels of discipline and drilling the (federal or state) government could demand of someone for them to be included in the Militia.
It is already accepted that felons and the mentally ill may be prevented from exercising 2nd Amendment rights, so it is perhaps not inconceivable that there could be minimum and maximum age limits, or minimum numbers of training / inspection days for people to be deemed validly part of the Militia.
Whether any such changes would reduce gun crimes, or increase crimes generally, or be politically viable or desirable, are separate questions.
Imagine if the 2A said this:
> "A well tailored suit, being necessary to a sharply dressed citizenry, the right of the people to keep and wear clothing, shall not be infringed."
Does this mean that the government now has a right to force dress codes on people so that their suits are well tailored?
Also justification clauses have been used in other contemporary laws too:
> Retrospective laws are highly injurious, oppressive and unjust. No such laws, therefore, should be made, either for the decision of civil causes, or the punishment of offences.
(From NH Ex Post Facto Article)
Does this mean that ONLY when the ex-post facto laws are injurious, oppressive and unjust, should that law be unconstitutional according to the NH constitution?
In the absence of the 2A, the government would have the power to ban any weapon (using the same authority they have to ban weapons that are not covered by the 2A today). By contrast, under your proposed fictional constitution, there would be no underlying basis for the government to control clothing generally, so your 2A wouldn't expand or limit the sorts of clothing allowable.
As for your second point, I interpret the "justification clause" as saying that all retrospective laws are ...unjust, and that "No such laws" means "No retrospective laws". The hypothetical of a ...just ex-post facto law is ruled out by definition.
Unlike in European countries, the American frontier was a dangerous place where every household needed a gun. Armed citizens existed regardless of whether there was a militia or not.
Enshrining this right in the constitution had little to do with frontier safety.
Remember, the original idea of the United States was that it was a bunch of separate governments that were federated. It still kind of is, but the federal government used to have much less control.
> nobody will ever say that Germany or France are a "paese"
Google lists >200,000 results for "la Germania è un paese" ("Germany is a country").
Which article or amendment of the U.S. Constitution forbids a state to leave without permission of the U.S.?
And good luck finding a good vehicle to overturn that precedent.
The TEU at least provides an explicit process for leaving
"country" = geographic unit, "state" = political unit.
The United States is itself a state, albeit a federation of smaller states.
And conversely, there are states spanning (parts of) more than one country: the UK with Northern Ireland, and many others (perhaps England, Scotland and Wales, if you consider them separate countries).
Particularly noticeable during the iOS App Store submission process (Android's is somewhat more lax, leaving the liability firmly with individual developers).
But I have a more pragmatic approach. If nuclear launch codes were written out on t-shirts I wouldn't be happy about it either. I think the real problem is ignorance. The US's main role after 1945, and the role of the UN, was and is to prevent another world war. Whether by virtue or by ignorance they have been successful, with the notable exception of a partial world war in the Middle East.
Having said that, the problem is ignorance towards technology and knowledge and resentment towards talent or individual ability. It's more general fear towards things they cannot understand, or rather, things they understand that they cannot subvert. But, I don't like to reduce myself to a protagonist's syndrome and I can more or less understand why the US government does what they do.
The only real node of certainty in the whole equation is that individual freedom is where the line should be drawn. And unfortunately for the obnoxious prescriptive types, any human can invent cryptography on their own whilst living in a cave.
If the government only found out that its nuclear launch codes were leaked because it saw them written on someone's t-shirt, I would be unhappy about the government, not the t-shirt.
Also, if the government decided to ban the t-shirts rather than changing the codes, I would be even more unhappy.
Now I use three Electron apps on a typical work day.
Sadly, Electron-based Element is still the best Matrix client by far.
Works even on old and crappy computers.
Back when I liked my messenger app on the Mac.
Apple has tried hard to degrade the experience of their built in messenger.
For reasons unclear to me, Thunderbird chose not to go with something like autocrypt.org but to stick with standard PGP, implementing only parts of its own attempt at simplification, which isn't nearly enough to get regular users on board, never mind implementing things like forward security, which Autocrypt does.
A missed chance from Mozilla, secure email would fit in well with their mission.
Now that most SMTP connections are encrypted with STARTTLS on the wire, Autocrypt is not that valuable. At the email server, Autocrypt can be trivially man-in-the-middled with just a simple script. It ends up being just another encrypted messaging solution that skips the hard but crucial problem of identity.
Added: Autocrypt does not do forward secrecy. But most people want to keep their old emails anyway, so forward secrecy wouldn't add much value.
Better to use a protocol designed with encryption in mind, like Signal, to get forward secrecy, avoid leaking metadata, and have encryption always on by default.
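To make the MITM point above concrete, here is a toy sketch using Python's stdlib `email` module. The header layout mirrors Autocrypt's `addr=...; keydata=...` shape, but the relay function and the key strings are invented for illustration; this is not Autocrypt's real API.

```python
import email
from email.message import EmailMessage

def evil_relay(raw: bytes, attacker_key: str) -> bytes:
    """A mail server that swaps the advertised key for the attacker's."""
    msg = email.message_from_bytes(raw)
    if "Autocrypt" in msg:
        prefix = msg["Autocrypt"].split("keydata=")[0]  # keep "addr=...; "
        msg.replace_header("Autocrypt", prefix + "keydata=" + attacker_key)
    return msg.as_bytes()

# Alice advertises her key in-band, as Autocrypt does.
msg = EmailMessage()
msg["From"] = "alice@example.org"
msg["To"] = "bob@example.org"
msg["Autocrypt"] = "addr=alice@example.org; keydata=ALICEKEYBASE64"
msg.set_content("hi")

# Any server on the path can rewrite the header before Bob sees it.
tampered = email.message_from_bytes(evil_relay(msg.as_bytes(), "MALLORYKEY"))
assert "MALLORYKEY" in tampered["Autocrypt"]  # Bob would now encrypt to Mallory
```

The server legitimately handles the message, so it can rewrite the advertised key, which is exactly why in-band key distribution alone can't solve identity.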
UPDATE: I have been reminded that PGP does not have to be used with email. I meant to say that I used to love using PGP with email, since that is the primary way I have used it. I will not comment on the use of PGP outside of email, since I haven't carefully examined its use in other contexts.
(Open)PGP is first and foremost a flexible packet format (and other specs), and GnuPG is more of a CLI "library" to interface with it -- all of it. You can build something that hides metadata, you can have forward secrecy, and encryption always-on by default with PGP (and GnuPG). You can use it for whatever trust model you want, neither OpenPGP (the specs) nor GnuPG prescribe a certain model. It provides building blocks. It just happens that no good client exists and no more high level specs were written that use it, which is highly unfortunate. The client in this case here used to be "Enigmail" (a wrapper around GnuPG), and now is built-in since plugins are not allowed to be as powerful with the new browser architecture that Thunderbird piggybacks on and this was the only way to bring PGP to Thunderbird users. Also, there are some more modern libraries nowadays for standard use cases around PGP.
Signal was able to move faster since it started from scratch and did not try to build things in an open and collaborative fashion, and it still refuses to work in any open way. Its founder Moxie even openly argues against standards [x]. I am not saying he does not have "a point", but in the long run this will lead to just more silos and ultimately technical stagnation, and for me it goes against the ethos of a public and open inter-net (and the lessons behind it: "Those who cannot remember the past are condemned to repeat it.")
That said, I do agree with your comment on a pure end-user level. It still makes me sad to see this constantly confused and not acknowledged better in places like HN, where people "should know". Technologists can defend and strive for the most promising long-term solution and "proper way to do it", and at the same time recommend "the best of the bad that is currently available". Even if it is confusing sometimes.
Just to give an example, there are plenty of high security use cases that require using smartcards. I'm very grateful that the OpenPGP standard and GnuPG exist that (can be made to) work in such cases. Signal does nothing in that space, rightfully so. But you are comparing different fruit to each other, which is kind of unfair.
[x] and still calls himself an "anarchist". You would think those know better...
I agree that one great weakness of Signal is its centralization, and that decentralization is a great strength of email.
For a decentralized encrypted communication channel, Matrix may become a good option now that it is encrypted by default, but I haven't studied carefully the quality of its encryption beyond that yet.
Regardless, I really do think a new protocol is required for secure communications and that email won't cut it. If Signal or Matrix can't do the job for some applications, we'll just have to try again. I cannot recommend email for secure communications, and in any situation where security matters, I have to prioritize people's safety. In terms of security, the best that is available in the realm of encrypted email is not and cannot be good enough.
> I used to love [A], but I now think [B] is a bad idea.
will lead many people to think that you consider A and B to be essentially the same thing. In the specific case of PGP and encrypted email, this is a category error: you are implying (perhaps unintentionally) that PGP’s only use is for encrypted email.
It abjectly fails at that. It's just awful to interface with.
The constant confusion between OpenPGP (the standard) and GnuPG (one implementation) has led many users to look elsewhere, which is unfortunate, but has also led many developers to look elsewhere, which is just sad.
Yes, trying to work within the existing standardization bodies can be very painful and disheartening, but the few people who try and survive in that space can use every good soul willing to assist.
The OpenPGP community has done a hundred times more than Signal for the state of security on the Internet, and I'm sure it will continue to do so. Signal is great, has pushed the limits quite far (and still does), and I'm very grateful that it exists and use it every day.
See https://latacora.micro.blog/2019/07/16/the-pgp-problem.html for more on that. And it isn't hard to find lots and lots of cryptographers agreeing with the thesis.
OpenPGP is used to secure everything from simple messages and email to authenticating OS updates for most servers today.
Should they use something more specialized now that it exists? Sure, but that's an argument for PGP, not against it; PGP is useful in situations where no specialized solution exists or where the existing ones are inadequate.
Until something comes along that covers every situation where OpenPGP is useful, I can't see people stopping using it (much to the dismay of the five vocal cryptographers who keep arguing against it).
At least with PGP email, you can make the argument that PGP sticks around because people don't want to recreate contact lists. But even that argument doesn't apply to updates.
I'm suggesting that "seems fine so far" is not an effective way to evaluate the solidity of cryptographic usage.
Well, just use TUF and in-toto ;)
For the simple case of "a single publisher publishes updates for a single product", TUF is overkill. Something like signify or seccure will be way easier to set up and use.
And for MOST of the places that it is used, it gets screwed up in some way that makes it not as secure as the people using it thought that it was.
Stop and think carefully about that statement. And repeat it to every person you meet who thinks that they are solving their problems with OpenPGP.
Due to the complexity of the PGP system, there are a plethora of downgrade attacks. Where something that was supposed to be at one level of security can be tricked into doing something much less secure. See https://twitter.com/xmppwocky/status/1291144278953955328, https://mailarchive.ietf.org/arch/msg/openpgp/JLn7sL6TqikUf-..., and https://www.eff.org/deeplinks/2018/05/pgp-and-efail-frequent... for three different examples of such attacks against PGP in recent years.
The second one is just yet another person discovering that the MDC check can be stripped off a message.
The third one seems to be just EFAIL which is not a downgrade or any attack really against PGP.
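As a rough illustration of where the distinction behind the MDC-stripping discussion lives: a symmetrically encrypted OpenPGP payload arrives either as packet tag 18 (SEIPD, integrity-protected) or legacy tag 9 (no MDC), and clients that still accept tag 9 are the downgrade surface. A minimal header parser following the RFC 4880 framing (the sample packets below are hand-crafted stubs, not real ciphertext):

```python
def packet_tag(data: bytes) -> int:
    """Return the tag of the first OpenPGP packet (RFC 4880 framing)."""
    first = data[0]
    if not first & 0x80:
        raise ValueError("not an OpenPGP packet")
    if first & 0x40:                  # new-format header: tag in low 6 bits
        return first & 0x3F
    return (first >> 2) & 0x0F        # old-format header: tag in bits 5..2

# Two crafted headers with one-octet dummy bodies:
legacy_sed = bytes([0x80 | (9 << 2), 0x01, 0x00])  # old format, tag 9 (no MDC)
seipd      = bytes([0x80 | 0x40 | 18, 0x01, 0x00]) # new format, tag 18 (SEIPD)

assert packet_tag(legacy_sed) == 9
assert packet_tag(seipd) == 18
```

An attacker who can rewrite packets can try to re-frame data from the protected form to the legacy one; whether that works depends entirely on whether the receiving client refuses tag 9, which is a client policy, not something the format enforces.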
I didn't know OpenPGP was used to authenticate OS updates for most servers today. Can you give me a place to find out more about that; are you talking about a specific OS?
If you're not using GnuPG, you can pick any format you want! What does OpenPGP add? Everything I can think of is a negative.
If you want an explanation of why GnuPG is dangerous for anything besides email, let me know. But parent post was very specifically NOT about GnuPG- it was about OpenPGP.
What does the OpenPGP format bring to the table, BESIDES HAVING A COMMONLY AVAILABLE IMPLEMENTATION IN GNUPG?
What I got from it is that the strength of PGP is in its flexibility and that it gains that flexibility from being standardized and open.
Of course you can extend PGP (or any protocol really) to include cryptographic handshakes, but that's besides the point.
That makes an even better case that PGP is not much of a modern secure system generally, rather than it just being bad for secure email.
I don't understand what you're trying to say here.
“Have one joint, and keep it well oiled.”
At the top of the thread someone says 'PGP has all this ancient cruft and also isn't good for email/email is maybe unsecurable anyway'. The PGP-enthusiast response to that is 'No problem! You see, PGP is so flexible, you can build anything out of it, including maybe an actually-secure system of some sort'.
This is a fundamentally inadequate response and we've realized it's inadequate and don't really design secure systems like that anymore. In fact, this class of concern has permeated almost all aspects of contemporary software systems design rather than being something narrowly limited to 'crypto algo agility is bad'. Here's an example/illustrative timeline that doesn't have anything to do with 'data at rest'.
1998 Netscape: Behold our cross-platform, cross-language super-toolkit for building super-extensible Enterprise Groupware apps. Also one of them is a web browser!
2015 Mozilla: [belated forehead slap] This is a terrible way to architect a web browser that has any hope of offering end-user security.
Edit: I'll expand the quote from the original thing a bit if it helps:
(Open)PGP is first and foremost a flexible packet format (and other specs), and GnuPG is more of a CLI "library" to interface with it -- all of it. You can build something that hides metadata, you can have forward secrecy, and encryption always-on by default with PGP (and GnuPG). You can use it for whatever trust model you want, neither OpenPGP (the specs) nor GnuPG prescribe a certain model. It provides building blocks.
This is a huge 'nope'. It's, by this point, an uncontroversial, well-understood nope. A known-nope, if you will! The only people who don't seem to think so are PGP people and that, in itself, is rather telling.
How do you do a downgrade attack on, say, an encrypted backup?
That's it. That's the actual arguments to this point. Then you just said nope.
You are not claiming that logic and facts are irrelevant here, are you?
I never said that. You're replying to someone else.
On the face of it, that sounds like exactly the position an anarchist would take...
That's an incredibly... unconventional read on the IETF.
Edit: language fixes. English is not my language of choice for discussing things like this...
If you try to actually build a secure environment within a group that tries to maximize security while getting real work done you find you want to be encrypted by default at least with each other. Signal is pretty suboptimal for heavy volumes of messages. If you and I have three threads going they are all jumbled together. If I want to send to more than one person I have to whip out my phone and form a group and name it.
PGP is imperfect but with the right settings and defaults it is far better than having email default to clear text. And in any long term endeavor with more than a few people you will find you want email.
Signal is great for what it does. It is not designed to be a high volume working tool like email though.
It's not possible to make email secure, the flaws are on the protocol level. To fix it, you would need to change it until it is no longer email.
If you’re saying Signal is strictly more secure, I agree 100%. It’s just not suitable for large amounts of comms within a group. I wish they’d improve it, and I have even detailed the features I think they should add (I made an HN thread when they got that donation from the WhatsApp guy).
For now it’s not realistic to expect people to know how to put ALL sensitive comms in Signal. You need to build an environment where as many channels as possible are secure by default. Setting everyone up with GPG and using it by default actually works in group settings and is much better than not doing it. The “just use Signal” meme is wrong. Because you can’t. Not at high volumes.
(Also, metadata is a lot less important in the context of a group that openly associates with one another as a great many do. It does suck that subject lines leak. But better to encrypt the message body than throw up hands and give up because no perfect substitute exists. One learns not to put much info into subject lines. You can always just not use them. Signal does not have them.)
How could Signal (or any other client-server protocol) not leak metadata? It is true that OpenPGP leaks more metadata than necessary (e.g. Subject), but it seems to me that any efficient message protocol needs to leak at least the three most important pieces of metadata: source, destination, and time.
One could avoid leaking the destination by broadcasting the encrypted message to many receivers (only the true one can decrypt it), so the server does not need to know the true destination, but that is rather inefficient.
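A toy sketch of that broadcast idea: the server sends the identical blob to every mailbox and each user tries their own key, so the server never learns the true destination. The XOR "cipher" and the `MSG:` marker stand in for real AEAD; nothing here is a real protocol.

```python
import os

def xor_box(key: bytes, blob: bytes) -> bytes:
    """Toy XOR 'cipher' for illustration only; real systems use AEAD."""
    return bytes(a ^ b for a, b in zip(key, blob.ljust(len(key), b"\0")))

users = {name: os.urandom(32) for name in ("alice", "bob", "carol")}

# The sender encrypts for bob; the MSG: prefix stands in for an AEAD tag.
ciphertext = xor_box(users["bob"], b"MSG:meet at noon")

# The server broadcasts the same blob to everyone; only bob's key "works".
received = [name for name, key in users.items()
            if xor_box(key, ciphertext).startswith(b"MSG:")]
assert received == ["bob"]
```

The inefficiency is plain from the loop: every mailbox pays bandwidth and decryption work for every message, which is why this idea doesn't scale without cleverer cryptography.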
I don't know if they can do anything about destination or time, but even hiding the source seems like a significant advancement.
The post mentions some kind of "short-lived" pseudo-sender, which is vulnerable to the same metadata analysis.
Encrypting the receiver is a lot harder though. It will probably involve dropping off the message at some central location and some very fancy cryptography. Secure multi-party computation will probably be involved. I don't know if it can be made scalable though.
That's the point. If the sender is encrypted, what's "the destination"? The IPv4 space? A random ID that represents the sender? How does that change the metadata angle?
`Have a nice day. Regards, sobani`
and see that I'm the sender.
If I'm worried about leaking my IP address, I can use something like Tor.
I'm confused why you think the sender needs to be known in any way to allow bytes to be delivered to your mail server.
PGP can be used to encrypt headers such as Subject. Mail addresses can be temporary and pseudonymous much more easily than phone numbers. One could imagine building something like "Signal" on top of PGP and "email" that has all the "metadata hiding" properties of Signal, and more, e.g. by introducing delays and random relaying order on SMTP level.
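The "delays and random relaying order" idea could be sketched as a batching relay, mix-network style. All names here are invented, and a real mix also needs cover traffic, padding, and a careful threat model; this is only the ordering trick.

```python
import random

class BatchingRelay:
    """Holds mail until a batch fills, then forwards it in shuffled order,
    so arrival order no longer mirrors sending order."""

    def __init__(self, batch_size: int, rng: random.Random):
        self.batch_size = batch_size
        self.rng = rng
        self.pool = []   # messages waiting for the batch to fill
        self.sent = []   # stand-in for actual SMTP delivery

    def submit(self, envelope: str) -> None:
        self.pool.append(envelope)
        if len(self.pool) >= self.batch_size:
            self.rng.shuffle(self.pool)   # break the ordering correlation
            self.sent.extend(self.pool)
            self.pool.clear()

relay = BatchingRelay(batch_size=4, rng=random.Random(7))
for i in range(4):
    relay.submit(f"msg{i}")

assert sorted(relay.sent) == ["msg0", "msg1", "msg2", "msg3"]
assert relay.pool == []
```

Batching also introduces the delay for free: nothing leaves until the batch fills, so an observer can't line up individual arrivals with departures.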
Deltachat is an example of an open project in that space: https://delta.chat/
To present a steel-man argument instead of a straw man, imagine they were comparing Delta Chat to Signal, pointing out that Delta Chat does encrypt subject headers, and that Signal users typically don't use temporary phone numbers (which are harder to obtain than temporary email addresses).
If you want, you could try comparing Signal's proprietary network versus the global SMTP network in terms of how secure they are against traffic timing analysis.
It's not possible to make Signal secure, the flaws are on the protocol level. To fix it, you would need to change it until it is no longer Signal.
You have your email set up to "default" to PGP? Can you say more about what you mean by that?
Transport-level security and authentication of email content is a perfectly valid use-case for an organisation when protection against third-party interference is desired. They don't need to worry about forward secrecy, they just need attachments to be transmitted in a legally-compliant manner.
For example, each month HR emails me my payslip as an encrypted attachment. I decrypt it and save it locally. They just have a batch job that encrypts for each user and sends. They don't have to worry about who uses which IM client. They don't need to care if I self-host or use Gmail, because their obligation is simply to keep the information secure in transit.
You are also too keen to support Signal's use of phone numbers as identifiers. That's a design choice, instead of using client-managed identifiers, and makes it unsuitable for organisational use. Whose phone will we use to send the deposition to the court... and who in the court will have a phone with Signal on it? Email by contrast is universal and integrates well into organisational processes without dependency upon individuals.
This makes sense to me. I've been trying to get friends, family and colleagues to encrypt email (hell, even signing would be a step!) for about 3 decades, now, and have basically thrown in the towel. So anything that affords a small toe-hold to begin the painful process of building the necessary network effects for the idea of encrypted email to gain traction is a good thing.
With PGP there is zero chance of anyone decrypting my data unless they have my key and password, which isn't uploaded anywhere and no apps have access to it.
For me, open-standard-ness and federated-ness are two measures that I consider even more important than cryptographic security. So a communication protocol that does not satisfy either of these has little value to me; I would rather use e-mail.
And personally, I think the points made in the linked article are weak.
Apps like ProtonMail or Tutanota may have an impact on encrypted email. If both sides use ProtonMail, communication is end-to-end secure. That's also the case with encrypted messaging. In both cases, communicating outside the platform may be insecure. At least an email address is more private than a phone number.
If both sides use [the same provider], it's not mail anymore, it's something internal. That is the fundamental issue of ProtonMail/Tutanota/any service provider that pretends to solve end-to-end encryption in email: without standards, it's a proprietary system. Today the only viable path towards easy E2E encryption in email is Autocrypt. AFAIK only Posteo is working towards including it.
Ummm... they do.
Reminds me of a tiny blog post I put up years ago about the future of archaeology. Forgive the silly site title.
I am not so sure of that. Even time stamps cannot be trusted very much.
For example, a lot of photo cameras, audio recorders and other such devices still don’t have a small extra battery for keeping track of time. So every time you run out of battery, or in the case of wall-connected devices you unplug them from the wall, the clock is reset and time is set to some date that the device manufacturer decided to use as the epoch for their device.
And most of these devices don’t have a GPS receiver in them either, so you need to manually set the time. And only some of them will prompt you for it directly while for others you need to remember this and go in the settings menu and adjust the time.
So I routinely get time stamps that are things like 1st of January 2008, or 2015 or what have you. And sure if you notice right away you can correct it. But even when you know you sometimes get tired of that kind of stuff so you just leave it like it is.
And that’s just at the recording stage. Then you transfer files between systems and sometimes you get the original time on your files and sometimes you don’t, and even if you did they might be wrong in the first place as per above.
And all of this is for someone who knows how this stuff works and who tries as well as they can to have proper dates on files and to keep them that way. Then you have people who don't know and don't notice.
And then you have people who willfully modify metadata.
And then there is the data itself. We live in a post-truth age they say.
How do you know in a thousand years that the data is from when it says it was from and furthermore that there is even a grain of truth in the data itself?
I am sure there will be competent historians that are able to learn a bit of our time from our data. But I don’t think they will have it easy.
And on top of that I don’t think data can really tell the future what life today is like. It’s hard to put in words what I mean by this last point so I won’t try to say any more about that.
We've been doing electronic calendars for how long now? Why are we still using a paradigm from paper based calendars? At the beginning of a month I can see three weeks ahead, but at the end of the month I can see three weeks behind. It frustrates me no end that this is still a thing. It reminds me of the early days of Google maps when they were no better than paper maps, but now we can rotate the map, zoom in and out etc. But calendars are still no better than paper calendars. Apart from the one in Thunderbird.
The “dead simple solution” is to just run the Signal protocol over SMTP, although I’m sure it’s possible there is a better design if you were to think about the specifics.
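The transport half of that really is simple: treat email purely as a dumb pipe and ship the ratchet's output as an opaque MIME part. A sketch with Python's stdlib `email` package; the `ciphertext` argument stands in for the Signal protocol's output, which is the actual hard part and out of scope here.

```python
from email.message import EmailMessage

def wrap_for_smtp(sender: str, recipient: str, ciphertext: bytes) -> EmailMessage:
    """Wrap an opaque ciphertext blob as a mail message for SMTP transport."""
    msg = EmailMessage()
    msg["From"] = sender
    msg["To"] = recipient
    msg["Subject"] = "[encrypted]"   # never put content here: subjects leak
    msg.add_attachment(ciphertext,
                       maintype="application",
                       subtype="octet-stream",
                       filename="ratchet.bin")
    return msg

m = wrap_for_smtp("a@example.org", "b@example.org", b"\x00opaque-blob")
assert m.get_content_type() == "multipart/mixed"
```

Note what this does and doesn't hide: the body is opaque, but From, To, and timing remain visible to every hop, which is the metadata problem discussed elsewhere in the thread.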
It's a much better user experience, and honestly I'm surprised that no enterprise orgs have adopted it, because it would probably be cheaper than all this phishing training.
1. At least until recently, Microsoft implemented it as blocking code in the UI thread: open a message and Outlook won't paint until it can verify the cert, access your local key store (hope your token is in a USB port which is 100% reliable), etc. If you thought "Does that mean that revocation checks block the UI until a network request completes?", you're sadly right.
2. Adoption hasn't been enough to be able to ban untrusted senders. This could still be quite useful for, say, a hard requirement that *@example.com must have signatures but it doesn't help with really common phishing tactics like pretending to be a vendor, business partner, etc.
3. You definitely need to get serious about key management because your ability to read your old encrypted email depends on retaining those keys. This is obviously not impossible but it has a cost when you think about the need to store them securely in a manner which can be restored after the inevitable system failures and compromises.
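For what it's worth, the fix for the blocking-UI problem in point 1 is standard: run the slow revocation check on a worker and let the UI paint immediately with a placeholder badge. A minimal sketch; `check_revocation` is an invented stub standing in for a real OCSP/CRL fetch.

```python
from concurrent.futures import ThreadPoolExecutor
import time

def check_revocation(cert_id: str) -> str:
    """Stand-in for a network OCSP/CRL lookup that may take a while."""
    time.sleep(0.05)
    return "valid"

with ThreadPoolExecutor(max_workers=1) as pool:
    future = pool.submit(check_revocation, "cert-123")
    status = "verifying..."        # the UI paints right away with this
    status = future.result()       # later, the badge resolves

assert status == "valid"
```

The message renders immediately either way; only the trust indicator waits on the network, so a dead OCSP responder can no longer freeze the whole client.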
Preventing forgery of @example.com is nice, but sadly doesn't matter that much, because a message from "Your Boss" <firstname.lastname@example.org> is just going to show up as Your Boss in the UI with most clients these days, and Enterprise-oriented clients are worse than the norm. The fastest way to see the actual address is to just press reply, which is irritating.
From: "John Q Boss"
Subject: "the blog is down"
Can you check <a href="http://0-dayfuntimes.example.org">http://blog.example.com</a>
Yikes. I should probably be surprised, but I guess I'm not. This sort of thing goes a long way towards explaining all sorts of other oddities of using Outlook.
They seem to be extremely hesitant to change it, which I assume is due to the corporate plugin ecosystem.
Presumably we're discussing how an open protocol addition might gain any traction at all over walled garden protocols like Slack -- and in those cases you want to maintain as much compatibility as possible.
"Federated SecureEmail 2.0" would be dead-on-arrival, where "Secure Client on top of bog-standard email" is ever so slightly less DOA. You get to re-use the existing identity/routing system, existing servers, existing authorization, etc.
Why not just toss the whole shebang and rebuild it below that layer? Signal Protocol seems to be pretty successful here.
>Presumably we're discussing how an open protocol addition might gain any traction at all over walled garden protocols like Slack
No, I'm asking how a new open protocol can be built on top of email in a way that maintains strong backwards compatibility while offering strong security guarantees, like end-to-end encryption. I don't think it's possible.
That's exactly what Autocrypt is doing. They're still at level 1, i.e. opportunistic encryption: try to encrypt if possible, fall back to plain text if not. That's 100% backwards-compatible.
What we all want is a system where we are sure that messages can't be sent in plaintext. The only way this can happen is if MTAs actually read messages and check if encryption is applied; if not, reject it. That's something that can happen but by definition it can't be backwards-compatible.
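Such an MTA-side check could look roughly like this: reject anything that is neither PGP/MIME (`multipart/encrypted`) nor inline ASCII armor. A sketch, not a real milter; the function name and acceptance rule are illustrative.

```python
import email

def accept(raw: bytes) -> bool:
    """Return True only if the message appears end-to-end encrypted."""
    msg = email.message_from_bytes(raw)
    if msg.get_content_type() == "multipart/encrypted":
        return True                              # PGP/MIME (RFC 3156)
    body = msg.get_payload(decode=True) or b""
    return body.lstrip().startswith(b"-----BEGIN PGP MESSAGE-----")

plain = b"From: a@x\r\nTo: b@x\r\n\r\nhello in the clear\r\n"
armored = (b"From: a@x\r\nTo: b@x\r\n\r\n"
           b"-----BEGIN PGP MESSAGE-----\r\n...\r\n"
           b"-----END PGP MESSAGE-----\r\n")

assert not accept(plain)    # the MTA would bounce this one
assert accept(armored)
```

As the comment says, the moment an MTA starts bouncing plain-text mail like this, it stops interoperating with the rest of the email world, which is precisely the backwards-compatibility break.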
You can't have strong security guarantees with backwards compatibility, because backwards compatibility requires plain-text sending.
Unless you're OK with limited guarantees like the UI will tell you when the mail could be sent in plain text. Or if the UI tells you the message will be end to end encrypted, it won't fallback to plain text, etc.
>Unless you're OK with limited guarantees like the UI will tell you when the mail could be sent in plain text. Or if the UI tells you the message will be end to end encrypted, it won't fallback to plain text, etc.
This would require changing all the clients.
Not really: the old clients would never send (or receive) end-to-end encrypted messages, so they don't need new UI to tell you that. That's the cost of unbounded backwards compatibility.
If you want to have 100% coverage of email users, changing all clients is part of the deal. But, uhhh, good luck with that. Server-based standards are an easier lift --- you could build a standard to require hop-to-hop encryption on mail and expect that to plausibly gain enough adoption to use over a manageable time frame. Of course it would be trivial for a hop in the delivery path to subvert that and remove the request, and all hops in delivery would still have message content access; but if people like it, a large majority of servers might support it in 5-10 years.
You're describing TLS, and this is already happening.
Published November 2019. I didn't see any data on support, and hadn't heard of it before looking just now.
People use email because everyone uses email; you're not going to get people to change how they use email entirely. Nobody cares that it's running on SMTP, they care that they can email everyone they know with an email address.
I don't feel confident enough to actually try this project yet, but at least I wrote down what my ideal chat app would be. Oh, and I came up with the fancy name 'chattaur' for it.
1. Write the message.
2. Encrypt the message.
3. Paste the encrypted message into the email client.
Your odds of accidentally sending a message in the clear are zero.
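The "encrypt first, then paste" workflow could at least be backstopped by a client-side guard that refuses to send anything that isn't ASCII-armored ciphertext. A rough sketch, with the guard function being my own illustration rather than part of any mail client:

```python
def safe_to_send(composed_body: str) -> bool:
    """Guard for the manual workflow: only let the send proceed if the
    pasted body is an ASCII-armored PGP message, never plaintext."""
    stripped = composed_body.strip()
    return (stripped.startswith("-----BEGIN PGP MESSAGE-----")
            and stripped.endswith("-----END PGP MESSAGE-----"))

# A plaintext draft would be blocked before it ever reaches the wire.
print(safe_to_send("hello in the clear"))  # False
```

Without a check like this, "the odds are zero" depends entirely on a human never pasting the wrong buffer.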
As the number of people on the email chain exceeds 2, the chance of someone accidentally copying and pasting the entire decrypted email chain into a reply reaches shockingly high levels.
A painful manual process, where a simpler, slightly less painful path exists but results in insecurity, is not a good way to make sure security happens.
Heck, people (myself included) regularly mess up Reply and Reply All :-)
Still, doing so solves the transport and persistence problems you would otherwise have to deal with.
I'm pretty sure the dead simple solution is to use either SMTP or Signal, and not a Frankenstein's monster of both.
36 comments, 1 day ago: https://news.ycombinator.com/item?id=24501872
226 comments, 2 months ago: https://news.ycombinator.com/item?id=23864934
51 comments, 12 months ago: https://news.ycombinator.com/item?id=21197327
* I don't know
* I do NOT trust
* I trust marginally
* I trust fully
* I trust ultimately
This is a real pop-up I got the last time I tried to use PGP with Thunderbird.
If people still get regular pop-ups like these, I don't think PGP will ever be popular.
They might have switched to PEP (pretty easy privacy), which uses TOFU (trust on first use), so maybe this is a thing of the past, but I don't know.
TOFU works if you actually do it properly. That means verifying the fingerprint of the key you just received out of band, one time. Now you can trust that key. How many people do this? Even IT professionals routinely just say yes to new SSH keys without checking. That's not TOFU; that's just trusting that you are not under attack, which is not security.
TOFU is really just a degenerate form of the web of trust where you automatically answer "I do not trust" to the above, so that each new trusted key needs to be established individually. But it does make things easier. Easier to not be doing security at all.
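Done properly, TOFU means pinning the first fingerprint you see and treating any later change as an alarm, with one out-of-band verification at pin time. A minimal sketch (the pin store and fingerprint scheme are simplified stand-ins; OpenPGP and SSH each use their own fingerprint formats):

```python
import hashlib

# Hypothetical TOFU pin store: identity -> fingerprint seen on first use.
pins: dict[str, str] = {}

def fingerprint(public_key_bytes: bytes) -> str:
    # Stand-in for a real key fingerprint computation.
    return hashlib.sha256(public_key_bytes).hexdigest()[:16]

def tofu_check(identity: str, public_key_bytes: bytes) -> str:
    fp = fingerprint(public_key_bytes)
    if identity not in pins:
        pins[identity] = fp
        # Proper TOFU: verify fp out of band ONCE before trusting it.
        return "new-key: verify out of band"
    return "ok" if pins[identity] == fp else "MISMATCH: possible attack"
```

The whole scheme hinges on the one out-of-band check; skipping it, as the parent says, is just hoping you weren't under attack on first contact.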
(Again, that was before PEP/TOFU. In my former company, we had PGP as a company mandate. Everyone hated it, and it never ever worked properly for group e-mails, no matter how hard we tried to hack it. Not sure what the experience is like now.)
I do agree that the marginally and fully options are useless. Wonder if adoption would be better if you just removed them, and were left with:
- I don't know what PGP is (exit, opens documentation or a tutorial)
- I definitely distrust this key (because it has been marked as malicious in the wild)
- I trust this key (because I have verified it either in person, or through a proof like Keybase)
It's not "how much do I trust this person's key", but "how much do I trust this person to sign other people's keys".
As in: how careful do I think that person is when checking other people's keys? You are not judging their key, or your own ability to check keys. You are judging that person's capability to check a third person's keys.
This is important for PGP "web of trust".
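That distinction feeds directly into how key validity is computed. A sketch of the classic rule, using what I understand to be GnuPG's default thresholds (a key counts as valid if signed by one fully trusted signer or three marginally trusted ones); the function itself is my own illustration:

```python
def key_is_valid(signers: list[str], trust: dict[str, str],
                 marginals_needed: int = 3, completes_needed: int = 1) -> bool:
    """Web-of-trust validity sketch: `trust` maps each signer of a key
    to how much you trust THEM to check other people's keys (not to how
    much you trust their own key)."""
    full = sum(1 for s in signers if trust.get(s) == "full")
    marginal = sum(1 for s in signers if trust.get(s) == "marginal")
    return full >= completes_needed or marginal >= marginals_needed

trust = {"carol": "full", "dave": "marginal", "erin": "marginal"}
print(key_is_valid(["dave", "erin"], trust))  # False: two marginals aren't enough
```

So "I trust marginally" and "I trust fully" aren't judgments about the key in front of you; they are weights on that person's future signatures.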
I have some serious feels for the guy, that had to have been a frustrating experience.
See https://latacora.micro.blog/2019/07/16/the-pgp-problem.html for why we shouldn't be using PGP in 2020.
However, every cryptographer whose judgment I trust who has bothered to comment on PGP has said not to use it. For example, see https://www.schneier.com/tag/pgp/ or https://secushare.org/PGP.
Therefore I won't be using PGP. No matter how much you like it.
I tried to use S/MIME. I really did. Went to those sleazy web pages of companies that issue certificates, was told I'd need to pay for various certificate products. Tried a free option. Tried to load the cert onto my YubiKey, tried using it in Apple Mail…
The conclusion: it is by no means easy or well supported, and it requires you to deal with those sleazy cert-selling companies.
It works quite well within groups. If you use it every day, it's a total non-issue. The problem is people who set it up, forget about it, and finally get an encrypted message two years later. Of course it is then confusing to try to remember how to use it.
Edit: I mixed Vertical and Wide in my original message. I meant the vertical view is unusable. I've edited it.
Above the three panes I simply have the message filter box and above that is the main toolbar (get messages, write, address book, view drop down, quick filter button).
Simple but works well for me on 1920x1080 or higher (and dealing with about a hundred or so emails a day).
It fails to associate my accounts with my keys in my keyring, so I try to import an exported key. Whenever I do this, it gets stuck in a loop asking me for the passphrase for an old, revoked key :( Even after I deleted the revoked keypair -- which I know I shouldn't -- it's refusing to cooperate.
So it looks like I've actually lost this feature by upgrading...
It also reminds me of the infamously long delay on JIRA-1369, the 20-year-old ticket.
At least it's neat that software can be used for 20 years.
I remember some years ago a security ticket was resolved after someone on reddit celebrated its decennial.
For example, bug 41929 is only 6 months younger. Briefly: you cannot have two different IMAP accounts in Thunderbird that have the same username and host but different ports.
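The bug boils down to identifying an account by (username, host) alone, so two accounts on the same host that differ only by port collide. A toy illustration of the distinction (my own sketch, not Thunderbird's actual code):

```python
def account_key(username: str, host: str, port: int) -> tuple:
    """Include the port in the identity used to deduplicate accounts, so
    same-user/same-host accounts on different ports (e.g. IMAP on 143
    vs IMAPS on 993) no longer collide."""
    return (username, host, port)

# With the port included, these are two distinct accounts:
print(account_key("bob", "imap.example.org", 143)
      != account_key("bob", "imap.example.org", 993))  # True
```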
16 years and still pending! All bets are off for this dark horse