21 years after the request OpenPGP support gets added to Thunderbird (bugzilla.mozilla.org)
690 points by janvdberg 10 months ago | hide | past | favorite | 269 comments

> For reasons associated with U.S. export restrictions, no cryptographic security of any kind is likely to be included in the original sources


The creepiest thing about seeing this ticket (again?) is noticing that the first comment is about how it used to be illegal to write anything with cryptographic security in the US and sell/give it to the outside world.


Any signatory to the Wassenaar Arrangement, which includes the entirety of North America, Europe (including Russia), Australia, India, and Pacific Asia (minus China) must consider cryptographic technologies to be munitions for the purposes of export. Now, these restrictions have been considerably loosened to the point that the export isn't really controlled, but international law still considers it a munition. The US is hardly unique in this regard.

It works the other way too. A surprising number of countries still restrict the import of cryptographic technology, including several EU states.

I used to be involved in building and shipping research robots (from Canada) and I remember we ran into this a few times with a bog-standard industrial wifi radio that for some reason was under ITAR. Interestingly, the manufacturer of the radio was set up to ship it directly to our customer, we just couldn't integrate it into their robot and ship it from our facility. So they had to put on the radio themselves.

The whole thing seemed very silly and theatre-y to me.

Ah yes...the efficiencies never stop coming when you're forced to skirt government mandates about tech they know almost nothing about.

I have war stories for days about all the times companies I worked for had to have customers pull DLLs from 3rd-party sites in order to comply with completely political mandates.

I had a professor tell me a story about a small tech company he worked at way back when, where they actually smuggled their POS terminal system by fishing boat out of the US and into Canada to sell outside the US to get around export controls even though the algorithms they used were widely known.

It'll be even more efficient soon, when the backdoors come.

Is establishing a HTTPS connection internationally exporting a munition, using a munition, or none of the above?

As far as I understand the current interpretation, hosting a download of a piece of software that includes code capable of establishing an HTTPS connection (e.g. bundling a TLS lib), on a US server, where it can be downloaded to a computer outside of the US, constitutes munitions export.

TIL GitHub is one of the biggest international munitions dealers

Relevant xkcd: https://xkcd.com/504/

For those that don't click through, it repeats what is a fairly cogent argument. If cryptography is classified as a munition, then there should be legal room to argue we have the right to it as provided by the 2nd amendment.

Congrats, you don't get encryption because you're not a member of a tightly regulated militia.

Sarcasm aside, the only way to make sure people get encryption is to make it impossible to restrict the technology. That's how encryption ended up spreading. You don't put disruptive tech on every computer on the planet by waiting for permission.

I see your point, but as a non-American who's flooded with videos of random Americans walking around supermarkets carrying semiautomatic rifles, I'm not sure what you mean with "tightly regulated militia".

It's a joke. The text of the 2nd amendment is slightly ambiguous. Many people (myself included) believe that the intent of it was to protect citizens' right to arm themselves, but only in the context of being a member of a state-run/regulated militia.

Unfortunately SCOTUS has continually widened the scope of 2A over the years.

> as a non-American who's flooded with videos of random Americans walking around supermarkets carrying semiautomatic rifles

You need to broaden your news sources; this is by no means common, except for perhaps in a few gun-happy states like Texas.

Thanks for the context, that joke about the 2nd amendment's widening scope had indeed gone over my head.

> You need to broaden your news sources

To be fully honest, I don't think I "need" to do anything. My life doesn't revolve around having an accurate impression of the US. The image your country spreads of itself is one of loud angry lefties on the coasts and gun-wielding rednecks in between. I'm sure that reality is very different, but don't blame me for the bad country marketing :-)

Texas isn't like that either. I've lived in Texas for almost a decade and I've never seen someone open carrying a long rifle at the supermarket.

Geez, the stereotypes people spread!

Sure, I know it isn't. But it still depends on where in Texas. If you're in Austin, yeah, you're probably not going to see people toting guns in supermarkets all over. If you're in Dallas or Houston, it'll be more common, but still not that common. If you're somewhere like San Antonio or Corpus Christi, it'll be even more common.

But yeah, there are vanishingly few places in the US where you should expect to be in the presence of gun-toting civilians while doing something as mundane as grocery shopping.

I have a different perspective. I think that the scope of the 2nd amendment has narrowed over the years. In 1776, private citizens owned every kind and sort of weapon used by the military. Ordinary people owned cannons, were instructed to put cannons on their private ships to defend against pirates, and owned the same sorts of muskets used by the army. The modern equivalent would be buying tanks at Walmart for cash and not registering the purchase nor requiring a background check.

By the way, the archaic meaning of "regulated" means "properly disciplined and drilled". It did not refer to control or supervision by a state.

> By the way, the archaic meaning of "regulated" means "properly disciplined and drilled". It did not refer to control or supervision by a state.

That still leaves open the question of what levels of discipline and drilling the (federal or state) government could demand of someone for them to be included in the Militia.

It is already accepted that felons and the mentally ill may be prevented from exercising 2nd Amendment rights, so it is perhaps not inconceivable that there could be minimum and maximum age limits, or minimum numbers of training / inspection days for people to be deemed validly part of the Militia.

Whether any such changes would reduce gun crimes, or increase crimes generally, or be politically viable or desirable, are separate questions.

The government cannot demand anything. The well-regulated part is a justification clause.

Imagine if the 2A said this:

> "A well tailored suit, being necessary to a sharply dressed citizenry, the right of the people to keep and wear clothing, shall not be infringed."

Does this mean that the government now has a right to force dress codes on people so that their suits are well tailored?

Also justification clauses have been used in other contemporary laws too:

> Retrospective laws are highly injurious, oppressive and unjust. No such laws, therefore, should be made, either for the decision of civil causes, or the punishment of offences. (From NH Ex Post Facto Article)

Does this mean that ONLY when the ex-post facto laws are injurious, oppressive and unjust, should that law be unconstitutional according to the NH constitution?

Great points, thank you, but I think I still disagree.

In the absence of the 2A, the government would have the power to ban any weapon (using the same authority they have to ban weapons that are not covered by the 2A today). By contrast, under your proposed fictional constitution, there would be no underlying basis for the government to control clothing generally, so your 2A wouldn't expand or limit the sorts of clothing allowable.

As for your second point, I interpret the "justification clause" as saying that all retrospective laws are ...unjust, and that "No such laws" means "No retrospective laws". The hypothetical of a ...just ex-post facto law is ruled out by definition.

> the intent of it was to protect citizens' right to arm themselves

Unlike in European countries, the American frontier was a dangerous place where every household needed a gun. Armed citizens existed regardless of whether there was a militia or not.

Except it was also seen as a civic duty for able-bodied men to join the militia. If we're talking late 18th century, the Venn diagram of gun ownership and militia membership has a very large degree of overlap.

While that may be true, that was not the intent of the 2nd amendment, which was there to ensure that regular citizens were both involved in the protection of the country's interests, and could act as a counter (by force, if necessary) to their own country if it decided to try to grab too much power and become tyrannical.

Enshrining this right in the constitution had little to do with frontier safety.

The US is all about selective enforcement, and the undesirable hacker type and their unpleasant "cryptography" is likely a higher priority for munitions enforcement than an irritable white guy with an AR-15 at the supermarket, because only one of them actually threatens the status quo.

But is the guy wearing a mask?

Cryptography and the AR-15 may both be classified as weapons, but you can't actually shoot someone with cryptography. I suspect that may also play a role when it comes to enforcement...

Well, not anymore as it seems. But the hacker type might now share common interests with the AR-15 guy.

“and on that day the ACLU and the EFF cited DC vs Heller (2010), and a crack appeared in the heavens and a loud voice spoke, saying ‘oi! wot’s all this then?!’”

There is no definition given for what "well regulated" entails. A bunch of nerds on the internet can certainly form their own militia to practice with crypto munitions.

The 2nd amendment is irrelevant here. That amendment only applies within the US and this is about export controls. Pretty much every US constitutional protection disintegrates once the issue becomes international.

The joke is pointed towards legislation that would make encryption illegal in the US.

Side note: don’t call the countries in the European Union “states”. They’re sovereign countries that have committed themselves to the Union through treaties; the EU is not a US-like federal government.

I'm pretty sure "state" meant a sovereign government before the United States existed.

Remember, the original idea of the United States was that it was a bunch of separate governments that were federated. It still kind of is, but the federal government used to have much less control.

The European Union (EU) consists of 27 member states.


I don't think it's incorrect to call them states. They're nation states. They are regularly referred to as member states.


Languages are strange in very different ways. The Italian word for country is "stato". How do we translate state as in "NY is a state of the USA"? Again "stato". We have "nazione" for nations but really nothing for countries. We do say "paesi esteri" for "foreign countries" but that's almost the only occurrence with that meaning. A "paese" is a town, so nobody will ever say that Germany or France are a "paese".

Actually, "paese" is the normal translation for country (optionally with capital "P" if one wants to avoid ambiguities with the "town" meaning)

> nobody will ever say that Germany or France are a "paese"

Google lists > 200.000 results for "la Germania è un paese".

It's amazing how contrarian HN can get. The EU is literally composed of member states in its founding documents.

Country means “sovereign state”; no one's suggesting EU member states aren't sovereign nations in the sense US states aren't.

U.S. states are sovereign states, just like EU states are. The United States government itself is also sovereign, in a sense that the EU government itself may or may not be. (Most) Americans live under two sovereigns: their state and the U.S.

Maybe so, but then we need to come up with more words. As the UK proved, a nation in the EU is free to leave the EU, but a state in the US is not. (Without consent of the US, of course.)

> As the UK proved, a nation in the EU is free to leave the EU, but a state in the US is not. (Without consent of the US, of course.)

Which article or amendment of the U.S. Constitution forbids a state to leave without permission of the U.S.?

There is of course no clause of the constitution that forbids secession but SCOTUS currently interprets the constitution as creating an "indestructible union" if I recall the language correctly.

And good luck finding a good vehicle to overturn that precedent.

The TEU at least provides an explicit process for leaving.

When we nitpick,

"country" = geographic unit, "state" = political unit.

The United States is itself a state, albeit a federation of smaller states.

This is a nice distinction, actually. It means we can think about "stateless countries" (like Western Sahara, perhaps), and "countryless states" (like the Sovereign Military Order of Malta).

Stateless countries are also those that cross state borders. Example: Kurdistan, Ireland.

And conversely, there are states spanning (parts of) more than one country: the UK with Northern Ireland, and many others (perhaps England, Scotland and Wales, if you consider them separate countries).

Wow, now I feel dated, I remember tons of discussion around this at the time and I remember that T-shirt from the Wikipedia article very well.

While we're at it, here's a more modern take on the RSA t-shirt. The QR code on the back encodes the Perl snippet above it.


This page is still up and running and I always thought this hack was pretty awesome: http://www.cypherspace.org/adam/rsa/

This is still in place, just now regulated[0] rather than outright illegal.

Particularly noticeable during the iOS app store[1] submission process (Android's is somewhat more lax[2] leaving the liability firmly with individual developers)

[0] https://www.bis.doc.gov/index.php/all-articles/15-policy-gui...

[1] https://help.apple.com/app-store-connect/#/dev88f5c7bf9

[2] https://support.google.com/googleplay/android-developer/answ...

That's why there were those "illegal" t-shirts with the RSA algorithm printed out in Perl.
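The shirt's Perl one-liner isn't reproduced here, but the math it implemented fits in a few lines of Python. A toy sketch with textbook parameters (tiny primes, no padding, utterly insecure; real RSA uses enormous primes and padding schemes such as OAEP):

```python
# Textbook RSA with toy primes -- illustrative only, wildly insecure.
p, q = 61, 53            # two small primes (real keys use primes hundreds of digits long)
n = p * q                # public modulus: 3233
phi = (p - 1) * (q - 1)  # Euler's totient of n: 3120
e = 17                   # public exponent, coprime with phi
d = pow(e, -1, phi)      # private exponent: modular inverse of e mod phi (Python 3.8+)

m = 42                   # "plaintext" as an integer smaller than n
c = pow(m, e, n)         # encrypt: c = m^e mod n
assert pow(c, d, n) == m # decrypt: m = c^d mod n
```

Fitting this on a t-shirt was the whole point: the "munition" in question is a dozen lines of modular arithmetic.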

But I have a more pragmatic approach. If nuclear launch codes were written out on t-shirts I wouldn't be happy about it either. I think the real problem is ignorance. The US's main role after 1945, and the role of the UN, was and is to prevent another world war. Whether by virtue or by ignorance they have been successful, with the notable exception of a partial world war in the Middle East.

Having said that, the problem is ignorance towards technology and knowledge and resentment towards talent or individual ability. It's more general fear towards things they cannot understand, or rather, things they understand that they cannot subvert. But, I don't like to reduce myself to a protagonist's syndrome and I can more or less understand why the US government does what they do.

The only real node of certainty in the whole equation is that individual freedom is where the line should be drawn. And unfortunately for the obnoxious prescriptive types, any human can invent cryptography on their own whilst living in a cave.

> If nuclear launch codes were written out on t-shirts I wouldn't be happy about it either.

If the government only found out that its nuclear launch codes were leaked because it saw them written on someone's t-shirt, I would be unhappy about the government, not the t-shirt.

Also, if the government decided to ban the t-shirts rather than changing the codes, I would be even more unhappy.

I remember back in the 90s exchanging PGP keys with my roommate to exchange encrypted emails. It was supposed to be so easy. Just 12 simple steps. Every time.

at least you had a friend to email! I couldn't get any of my friends to do it. "Man we can encrypt our emails." "But why..." "It'd be cool" "This seems hard." "Come on, exchange keys with me." "I don't want to make one."

My high school friends and I settled for using Gain and Pidgin to enable the "secure" icon. :)

Ah, the good old days when I could just plug my IM services into one desktop app. I miss those days very much.

Now I use three Electron apps on a typical work day.

Matrix bridges do that for me today! Currently using IRC and Telegram bridges, thinking about adding Slack.

Sadly, Electron-based Element is still the best Matrix client by far.

The lack of thread support in Element makes Slack bridging very hard to use, sadly.

I needed some time to figure out you're not talking about multithreading in JavaScript/electron, but about this



I still use Pidgin for Skype and Hangouts. For text only it's perfect:

- https://aur.archlinux.org/packages/purple-hangouts-git/ - https://aur.archlinux.org/packages/purple-skypeweb-git/

Works even on old and crappy computers.

Rambox and Franz both do this today. They are electron apps but you only need one for all your services.

For a while you could use Pidgin and OTR to even have secure messages through Facebook's Messenger. I'm not sure if they decrypted it or not, but it sure was fun.

Reminded me of my Adium days.

Back when I liked my messenger app on the Mac.

Apple has tried hard to degrade the experience of their built in messenger.

Been a hot minute since I've thought about Gaim. Thanks for the memories.

Yep, I meant Gaim. Damn you, autocorrect!

Gaim is just Pidgin's original name.

I assume he first used it when it was called Gaim, so he has stronger memories associated with that name than with Pidgin.

PGP's trust levels are what made PGP never take off: even laypersons could see this is theater, not security.

For reasons unclear to me, Thunderbird chose not to go with something like autocrypt.org, but to stick with standard PGP and implement parts of their own attempt at simplification, which isn't nearly enough to get regular users on board, never mind implementing things like forward secrecy, which Autocrypt does.

A missed chance from Mozilla, secure email would fit in well with their mission.

There is an autocrypt plugin for Thunderbird:

* https://addons.thunderbird.net/en-US/thunderbird/addon/autoc...

Now that most SMTP connections are encrypted with STARTTLS on the wire, autocrypt is not that valuable. At an email server, autocrypt can be trivially man-in-the-middled with just a simple script. It ends up being just another encrypted messaging solution that skips the hard but crucial problem of identity.

Added: Autocrypt does not do forward secrecy. Most people want to keep their old emails, so forward secrecy wouldn't add much value:

* https://articles.59.ca/doku.php?id=pgpfan:forward_secrecy

I know, and I use it. It however won't support TB 78.

I used to love PGP, but I now think encrypted email is a bad idea. https://latacora.micro.blog/2020/02/19/stop-using-encrypted....

Better to use a protocol designed with encryption in mind, like Signal, to get forward secrecy, avoid leaking metadata, and have encryption always on by default.

UPDATE: I have been reminded that PGP does not have to be used with email. I meant to say that I used to love using PGP with email, since that is the primary way I have used it. I will not comment on the use of PGP outside of email, since I haven't carefully examined its use in other contexts.

That is the single most common misconception around PGP, and it comes up every time:

(Open)PGP is first and foremost a flexible packet format (and other specs), and GnuPG is more of a CLI "library" to interface with it -- all of it. You can build something that hides metadata, you can have forward secrecy, and you can have encryption always on by default with PGP (and GnuPG). You can use it for whatever trust model you want; neither OpenPGP (the specs) nor GnuPG prescribes a certain model. It provides building blocks. It just happens that no good client exists and no higher-level specs were written that use it, which is highly unfortunate. The client in this case used to be "Enigmail" (a wrapper around GnuPG), and the support is now built in, since plugins are no longer allowed to be as powerful under the new browser architecture that Thunderbird piggybacks on, and this was the only way to bring PGP to Thunderbird users. Also, there are some more modern libraries nowadays for standard use cases around PGP.
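To make the "flexible packet format" point concrete, here is a rough Python sketch of parsing an OpenPGP packet header as laid out in RFC 4880, section 4.2. The function name is mine, and several length encodings (multi-octet new-format lengths, indeterminate old-format lengths) are deliberately omitted:

```python
# Rough sketch of OpenPGP packet-header parsing (RFC 4880, section 4.2).
# Handles old- and new-format headers and the simplest length encodings only.

OLD_LEN_SIZES = {0: 1, 1: 2, 2: 4}        # old-format length-type -> octets

def parse_header(data: bytes):
    """Return (tag, body_length, header_size) for the packet starting at data[0]."""
    first = data[0]
    assert first & 0x80, "bit 7 is always set in an OpenPGP packet header"
    if first & 0x40:                      # bit 6 set: new packet format
        tag = first & 0x3F                # tag lives in bits 5..0
        if data[1] < 192:                 # one-octet body length
            return tag, data[1], 2
        raise NotImplementedError("multi-octet new-format lengths omitted")
    tag = (first >> 2) & 0x0F             # old format: tag in bits 5..2
    size = OLD_LEN_SIZES[first & 0x03]    # length type in bits 1..0 (3 = indeterminate, omitted)
    return tag, int.from_bytes(data[1:1 + size], "big"), 1 + size

# 0x84 = old format, tag 1 (public-key encrypted session key), one-octet length
tag, length, hdr = parse_header(bytes([0x84, 0x10]) + bytes(16))
```

Everything in an OpenPGP message (keys, signatures, literal data, encrypted sessions) is a sequence of such tagged packets, which is what makes the format so reusable.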

Signal was able to move faster since it started from scratch, did not try to build things in an open and collaborative fashion, and still refuses to work in any open way. Its founder Moxie even openly argues against standards [x]. I am not saying he does not have "a point", but in the long run this will lead to just more silos and ultimately technical stagnation, and for me it goes against the ethos of "a public and open inter-net" (and the learnings behind it. 'Those who cannot remember the past are condemned to repeat it.')

That said, I do agree with your comment on a pure end-user level. It still makes me sad to always see this confused and not acknowledged better in places like HN, where people "should know". Technologists can defend and strive for the most promising long-term solution and "proper way to do it", and at the same time recommend "the best of the bad that is currently available". Even if it is confusing sometimes.

Just to give an example, there are plenty of high security use cases that require using smartcards. I'm very grateful that the OpenPGP standard and GnuPG exist that (can be made to) work in such cases. Signal does nothing in that space, rightfully so. But you are comparing different fruit to each other, which is kind of unfair.

[x] and still calls himself an "anarchist". You would think those would know better...

I understand that PGP doesn't have to be used with email. That is why I haven't commented on PGP, I'm saying that I think encrypted email is a bad idea, regardless of whether you use PGP or something else. We're posting in a thread about Thunderbird, an e-mail client :)

I agree that one great weakness of Signal is its centralization, and that decentralization is a great strength of email.

For a decentralized encrypted communication channel, Matrix may become a good option now that it is encrypted by default, but I haven't studied carefully the quality of its encryption beyond that yet.

Regardless, I really do think a new protocol is required for secure communications and that email won't cut it. If Signal or Matrix can't do the job for some applications, we'll just have to try again. I cannot recommend email for secure communications, and in any situation where security matters, I have to prioritize people's safety. In terms of security, the best that is available in the realm of encrypted email is not and cannot be good enough.

For future reference, statements like this:

> I used to love [A], but I now think [B] is a bad idea.

will lead many people to think that you consider A and B to be essentially the same thing. In the specific case of PGP and encrypted email, this is a category error: you are implying (perhaps unintentionally) that PGP’s only use is for encrypted email.

Fair enough, I've updated my top comment with an attempt to clarify.

> GnuPG is more of a CLI "library" to interface with it

It abjectly fails at that. It's just awful to interface with.

Agreed. It has been long overdue that alternative OpenPGP implementations exist that try to address some of the peculiarities of GnuPG -- most of which are [still] there because its founder wants to preserve compatibility at all costs to support some of its long-term institutional users. And, yes, dealing with these peculiarities should not be the responsibility of end users, but of further abstraction layers built on top of it (of which there are only a few, and all of them focused on the email use case).

The constant confusion between OpenPGP (the standard) and GnuPG (one implementation) has led many users to look elsewhere, which is unfortunate, but has also led many developers to look elsewhere, which is just sad.

Yes, trying to work within the existing standardization bodies can be very painful and disheartening, but the few people who try and survive in that space can use every good soul willing to assist.

The OpenPGP community has done a hundred times more than Signal for the state of security on the Internet, and I'm sure it will continue to do so. Signal is great, and pushed the limits quite far (and still does), and I'm very grateful that it exists and use it every day.

The OpenPGP community would do more for security if they listened to serious cryptographers and began recommending better solutions.

See https://latacora.micro.blog/2019/07/16/the-pgp-problem.html for more on that. And it isn't hard to find lots and lots of cryptographers agreeing with the thesis.

People in this industry use OpenPGP because it's flexible and amenable to almost any use case you can think of. "Better solutions" are usually indeed better, but they are also so specialized for their purpose that they can't easily be used for anything else.

OpenPGP is used to secure everything from simple messages and email to authenticating OS updates for most servers today.

Should they use something more specialized now that it exists? Sure, but that's an argument for PGP, not against it; PGP is useful in situations where no specialized solutions exist or where they are inadequate.

Until something comes along that can cover every situation where OpenPGP is useful, I can't see people stopping their use of it (much to the dismay of the five vocal cryptologists who keep arguing against it).

It's telling that the most common example of a widespread use of PGP (modern messaging applications exchange more messages in a day than OpenPGP has ever exchanged) is software update schemes, because software update cryptography is both a solved problem (just use signify) and doesn't have network effects; it's a "trust anchor" application.

At least with PGP email, you can make the argument that PGP sticks around because people don't want to recreate contact lists. But even that argument doesn't apply to updates.

Backups, archiving, password managers: the list is long. Duplicity has many users: http://duplicity.nongnu.org Pass is also pretty popular on HN: https://www.passwordstore.org Both use GPG.

I use pass and I would switch in a heartbeat to a fork of it that used ssh keys or something similar instead of gpg. For something so amazingly simple and useful, its dependence on the clunky mess that is gpg key management is an anchor that weighs it down.

Key management is a burden in every cryptosystem. I'm using KeePass and can recommend it, it works well.

Would you know if it failed?

If it would "fail" and there would be no consequences so I could't tell if it failed or not - would it make a difference?

If the failure were discovered by you a year later, you would realize that all you thought was protected had been in an adversary's hands.

I'm suggesting that "seems fine so far" is not effective at evaluating solidity of cryptographical usage.

> software update cryptography is both a solved problem (just use signify)

Well, just use TUF [1] and in-toto [2] ;)

[1] https://theupdateframework.io/

[2] https://in-toto.io/

Note that TUF is great for things with multiple contributors (think npm or pypa).

For the simple case of "a single publisher publishes updates for a single product", TUF is overkill. Something like signify or seccure will be way easier to set up and use.

signify is nice when key distribution, revocation, and rotation are handled for you... but how do you do that securely for many different publishers on a single repo?

> OpenPGP is used to secure everything from simple messages and email to authenticating OS updates for most servers today.

And for MOST of the places that it is used, it gets screwed up in some way that makes it not as secure as the people using it thought that it was.

Stop and think carefully about that statement. And repeat it to every person you meet who thinks that they are solving their problems with OpenPGP.

What do you mean by PGP "not [being] as secure as the people using it thought that it was"? Can you mention something specific?

Here is something specific.

Due to the complexity of the PGP system, there is a plethora of downgrade attacks, where something that was supposed to be at one level of security can be tricked into doing something much less secure. See https://twitter.com/xmppwocky/status/1291144278953955328, https://mailarchive.ietf.org/arch/msg/openpgp/JLn7sL6TqikUf-..., and https://www.eff.org/deeplinks/2018/05/pgp-and-efail-frequent... for three different examples of such attacks against PGP in recent years.
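The MDC-stripping class of problem comes down to unauthenticated ciphertext being malleable. A toy Python illustration (a made-up XOR keystream, not PGP's actual CFB mode) of how an attacker who cannot decrypt a message can still rewrite its plaintext:

```python
# Why the MDC (integrity check) matters: without authentication, flipping
# ciphertext bits flips the same plaintext bits. Toy XOR keystream for
# illustration -- this is NOT PGP's real CFB construction.
import hashlib

def keystream(key: bytes, n: int) -> bytes:
    out = b""
    counter = 0
    while len(out) < n:                  # stretch the key with a hash counter
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:n]

def xor(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

key = b"secret key"
plaintext = b"PAY $0100 TO MALLORY"
ciphertext = xor(plaintext, keystream(key, len(plaintext)))

# The attacker knows (or guesses) the plaintext layout but NOT the key:
delta = xor(b"PAY $0100 TO MALLORY", b"PAY $9900 TO MALLORY")
tampered = xor(ciphertext, delta)        # flip only the digit positions

decrypted = xor(tampered, keystream(key, len(plaintext)))
assert decrypted == b"PAY $9900 TO MALLORY"
```

An intact MDC would make the tampering detectable; stripping it removes exactly that protection, which is why these downgrade paths matter.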

The first one appears to be some sort of joke.

The second one is just yet another person discovering that the MDC check can be stripped off a message.

The third one seems to be just EFAIL, which is not a downgrade attack, nor really any attack against PGP itself.

What industry do you mean by "this industry"? Just... computing?

I didn't know OpenPGP was used to authenticate OS updates for most servers today. Can you give me a place to find out more about that; are you talking about a specific OS?

Package managers on pretty much every Linux distribution use GPG for verifying packages.

It's used to digitally sign software, not to encrypt it.

What is a good non-email use case for the PGP format?

Any use case for which no specialized solution already exists. Source code signatures and Apt/RPM packages are good examples of this.

What does OpenPGP bring there? At least GnuPG has the benefit of being a tool that's present on many systems and is capable of verifying signatures.

If you're not using GnuPG, you can pick any format you want! What does OpenPGP add? Everything I can think of is a negative.

OpenPGP is the standard; GnuPG is an implementation.

Right, so why talk about the standard when you mean, specifically, GnuPG?

If you want an explanation of why GnuPG is dangerous for anything besides email, let me know. But the parent post was very specifically NOT about GnuPG; it was about OpenPGP.

Because GnuPG is not the only implementation.

Yes, that's what I said, and why I was asking!


It's the generic term.

It is used routinely on the dark nets by pasting encrypted messages into web forms.

Who said that it's for a user to interface with directly? mutt interfaces with it perfectly fine, and mutt's PGP UX is also fine.

This is a weird argument. By your logic, there's practically no cryptosystem PGP can't be; after all, it's an omnibus standard to which people routinely add extensions for new algorithms. Want a double ratchet? An elliptic curve triple handshake? The "building blocks" are all there, all you have to do is use PGP's "flexible packet format" to carry them.

This is an interesting interpretation of what he stated.

What I got from it is that the strength of PGP is in its flexibility and that it gains that flexibility from being standardized and open.

Of course you can extend PGP (or any protocol really) to include cryptographic handshakes, but that's besides the point.

> (Open)PGP is first and foremost a flexible packet format

That makes an even better case that PGP is not much of a modern secure system generally, rather than it just being bad for secure email.

Because packet formats are bad?

I don't understand what you're trying to say here.

Flexibility in cryptography is often very bad and opens doors to downgrade attacks or things like JWT’s alg:none issues.
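The JWT "alg: none" issue mentioned here is easy to demonstrate: a verifier that trusts the token's own header field can be handed a token whose signature segment is empty. A stdlib-only Python sketch (the header and claims are made up for illustration):

```python
# The JWT "alg: none" pitfall: a verifier that honours the token's own
# header will accept this forged, unsigned token.
import base64, json

def b64url(data: bytes) -> str:
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode()

header = {"alg": "none", "typ": "JWT"}       # attacker-chosen header
payload = {"sub": "admin", "role": "root"}   # attacker-chosen claims

forged = ".".join([
    b64url(json.dumps(header).encode()),
    b64url(json.dumps(payload).encode()),
    "",                                      # empty signature segment
])

# The fix is the opposite of flexibility: pin the accepted algorithm
# server-side (e.g. require RS256) instead of trusting header["alg"].
```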


“Have one joint, and keep it well oiled.”

The linked article is all about protecting data in flight like in the TLS case. PGP is all about protecting data at rest. The two applications are fundamentally different.

I think you're misunderstanding the general complaint; or maybe not the complaint itself, but how general, rather than specific, it is.

At the top of the thread someone says 'PGP has all this ancient cruft and also isn't good for email/email is maybe unsecurable anyway'. The PGP-enthusiast response to that is 'No problem! You see, PGP is so flexible, you can build anything out of it, including maybe an actually-secure system of some sort'.

This is a fundamentally inadequate response and we've realized it's inadequate and don't really design secure systems like that anymore. In fact, this class of concern has permeated almost all aspects of contemporary software systems design rather than being something narrowly limited to 'crypto algo agility is bad'. Here's an example/illustrative timeline that doesn't have anything to do with 'data at rest'.

1998 Netscape: Behold our cross-platform, cross-language super-toolkit for building super-extensible Enterprise Groupware apps. Also one of them is a web browser!

2015 Mozilla: [belated forehead slap] This is a terrible way to architect a web browser that has any hope of offering end-user security.

How do you do a downgrade attack on, say, an encrypted backup?

I guess I don't understand how that's a response to anything I've written in this thread.

Edit: I'll expand the quote from the original thing a bit if it helps:

(Open)PGP is first and foremost a flexible packet format (and other specs), and GnuPG is more of a CLI "library" to interface with it -- all of it. You can build something that hides metadata, you can have forward secrecy, and encryption always-on by default with PGP (and GnuPG). You can use it for whatever trust model you want, neither OpenPGP (the specs) nor GnuPG prescribe a certain model. It provides building blocks.

This is a huge 'nope'. It's, by this point, an uncontroversial, well-understood nope. A known-nope, if you will! The only people who don't seem to think so are PGP people and that, in itself, is rather telling.

You can't just "Nope" this issue. You need to come up with some sort of rational argument. The PGP people have been dealing with the data at rest issue since forever. It is a bit presumptuous to just write off all that acquired wisdom without reason.

I have come up with a rational argument, you just refuse to address it at all. Your argument boils down to 'the PGP people are smarter than everyone else' which I think isn't really even a rational argument.

There is no reason to appeal to authority here when a simple koan can provide enlightenment. I will repeat it one more time:

How do you do a downgrade attack on, say, an encrypted backup?

No. We've got a threadful of arguments here in which you simply refuse to engage. You don't get to superciliously demand Socratic satisfaction, wave around your list of logical fallacies and mumble oracularly about koans and enlightenment. You have to make an argument otherwise what you're doing is simply rude preening. It's fine if you have nothing to say. You don't get to (at least, publicly) pretend you've actually said anything and that it's everyone else who has to meet your precise terms of discussion. Sorry.

This all started when you said something to the effect that "packet oriented formats" were bad. When asked to clarify you switched the topic to how flexibility in cryptography was bad because it could lead to downgrade attacks. I pointed out that downgrade attacks were not really possible in a data at rest application and that the article you linked to was about data in flight applications.

That's it. That's the actual arguments to this point. Then you just said nope.

You are not claiming that logic and facts are irrelevant here, are you?

you switched the topic to how flexibility in cryptography

I never said that. You're replying to someone else.

Because flexibility brings bugs, and bugs in cryptography mean a breach of security.

Because a flexible 'format' or 'model' or whatever, with which you can construct insecure systems or (hypothetically) secure systems, is not really useful; it ends up being inherently insecure.

> Moxie even openly argues against standards [x] and still calls himself an "anarchist".

On the face of it, that sounds like exactly the position an anarchist would take...

If you believe in abolishing hierarchy and institutions where they have no legitimacy, _all_ you are left with is communication, community, and open standardization as practiced in anarchist bodies such as the IETF.

> anarchist bodies such as the IETF

That's an incredibly... unconventional read on the IETF.

Not really. I know very little about both anarchism and the IETF, but voluntary organisation (as opposed to wage slavery) and democratic governing of the group are very much in line with many anarchist schools.

Edit: language fixes. English is not my language of choice for discussing things like this...

Anarchists would be philosophically opposed to top-down, mandated standards, not standards in general.

Signal is great as an all-in-one solution if encrypted messaging is a hobby. It is also very good for mobile and for occasional encrypted messages.

If you try to actually build a secure environment within a group that tries to maximize security while getting real work done you find you want to be encrypted by default at least with each other. Signal is pretty suboptimal for heavy volumes of messages. If you and I have three threads going they are all jumbled together. If I want to send to more than one person I have to whip out my phone and form a group and name it.

PGP is imperfect but with the right settings and defaults it is far better than having email default to clear text. And in any long term endeavor with more than a few people you will find you want email.

Signal is great for what it does. It is not designed to be a high volume working tool like email though.

So you're saying that if you're building a professional secure environment, you don't need forward secrecy and it's ok to leak metadata? This doesn't make sense to me. The US gov't kills people based on metadata: https://ssd.eff.org/en/module/why-metadata-matters

It's not possible to make email secure, the flaws are on the protocol level. To fix it, you would need to change it until it is no longer email.

I’m saying people will use email in any decent sized group over any significant length of time and it is very good to encrypt that email by default.

If you’re saying Signal is strictly more secure I agree 100%. It’s just not suitable for large amounts of comms within a group. I wish they’d improve it and have even detailed features I think they should add (I made an HN thread when they got that donation from the whatsapp guy).

For now it’s not realistic to expect people to know how to put ALL sensitive comms in Signal. You need to build an environment where as many channels as possible are secure by default. Setting everyone up with GPG and using it by default actually works in group settings and is much better than not doing it. The “just use Signal” meme is wrong. Because you can’t. Not at high volumes.

(Also, metadata is a lot less important in the context of a group that openly associates with one another as a great many do. It does suck that subject lines leak. But better to encrypt the message body than throw up hands and give up because no perfect substitute exists. One learns not to put much info into subject lines. You can always just not use them. Signal does not have them.)

What about Matrix/Element for heavier communications? It's at least encrypted by default now, although I haven't carefully studied their security otherwise.

Matrix is probably the only real alternative to email, I agree. Hopefully it becomes dominant.

> and it's ok to leak metadata?

How could Signal (or any other client-server protocol) not leak metadata? It is true that OpenPGP leaks more metadata than necessary (e.g. Subject), but it seems to me that any efficient message protocol needs to leak at least the three most important pieces of metadata: source, destination, and time.

One could avoid leaking the destination by broadcasting the encrypted message to many receivers (where only the true one can decrypt it), so the server does not need to know true destinations, but that is rather inefficient.

Signal has been working on features like "sealed sender", which encrypts the source metadata: https://signal.org/blog/sealed-sender/

I don't know if they can do anything about destination or time, but even hiding the source seems like a significant advancement.

How does that even work? The only way to encrypt the sender is to send every (encrypted) message to everyone and let the clients drop the ones they can't decrypt. On a phone. I hope they give out free batteries.

The post mentions some kind of "short-lived" pseudo-sender, which is vulnerable to the same metadata analysis.

Encrypting the sender is pretty easy. Deliver the encrypted message+sender at the destination. Only the receiver will be able to decrypt it and see who the sender was.

Encrypting the receiver is a lot harder though. It will probably involve dropping off the message at some central location and some very fancy cryptography. Secure multi-party computation [0] will probably be involved. I don't know if it can be made scalable though.

[0] https://en.wikipedia.org/wiki/Secure_multi-party_computation
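A structural sketch of that envelope idea, with a loud caveat: seal() below is NOT encryption, just a stand-in (e.g. for a NaCl sealed box to the recipient's public key) to show which fields the server would and wouldn't see. All names are illustrative.

```python
import base64, json

def seal(sender: str, body: str) -> str:
    """Placeholder for real asymmetric encryption to the recipient's key.
    In a real system only the recipient could open this blob."""
    blob = json.dumps({"from": sender, "body": body}).encode()
    return base64.b64encode(blob).decode()  # NOT crypto -- just the envelope shape

def make_envelope(recipient: str, sender: str, body: str) -> dict:
    # The server sees only the recipient; the sender identity rides
    # inside the sealed blob, so no broadcast-to-everyone is needed.
    return {"to": recipient, "sealed": seal(sender, body)}

env = make_envelope("jet@spiegel.example", "sobani", "Have a nice day.")
print(sorted(env))       # ['sealed', 'to'] -- the server-visible fields
opened = json.loads(base64.b64decode(env["sealed"]))
print(opened["from"])    # sobani -- only the recipient learns the sender
```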

> Encrypting the sender is pretty easy. Deliver the encrypted message+sender at the destination.

That's the point. If the sender is encrypted, what's "the destination"? The IPv4 space? A random ID that represents the sender? How does that change the metadata angle?

If I deliver a bunch of bytes at jet@spiegel.com then that's the destination. Now you can decrypt it with your private key and get

`Have a nice day. Regards, sobani`

and see that I'm the sender.

If I'm worried about leaking my IP address, I can use something like Tor.

I'm confused why you think the sender needs to be known in any way to allow bytes to be delivered to your mail server.

Signal pumps all traffic across AWS infrastructure, where the simplest of all traffic analysis can deanonymize its users. Signal relies on phone numbers (for reasons), which again is not exactly the best "metadata" to carry around. If you look at the NSA material in the Snowden cache, it becomes clear that the whole "US kills based on metadata" is fed by piggybacking exactly on such systems, like deanonymization of simple VPNs and other "streams of data with simple proxies in between".

PGP can be used to encrypt headers such as Subject. Mail addresses can be temporary and pseudonymous much more easily than phone numbers. One could imagine building something like "Signal" on top of PGP and "email" that has all the "metadata hiding" properties of Signal, and more, e.g. by introducing delays and random relaying order on SMTP level.

Deltachat is an example of an open project in that space: https://delta.chat/

Ah, the "sufficiently sophisticated PGP usage", where everyone uses ephemeral email accounts for each message, subjects are encrypted, nobody can reply plaintext, and nothing runs over an infrastructure that can be traffic-analyzed. That "Deltachat" doesn't accomplish this is besides the point, of course.

One can imagine all sorts of thing. Are you saying that you routinely send PGP email with Subject headers encrypted, using temporary mail addresses and pseudonyms? This is your routine email use, with a bunch of correspondents who do the same? Or, along with "random relay ordering on SMTP level", it's just something you're imagining? If you're saying the PGP use you imagine would be more secure than the Signal use that actually exists... I mean, I guess that's cool.

> If you're saying the PGP use you imagine would be more secure than the Signal use that actually exists...

To present an iron-man argument instead of a straw-man, imagine they were comparing Delta Chat to Signal, and pointing out that Delta Chat does encrypt subject headers[0], and that Signal users typically don't use temporary phone numbers (which are harder to obtain than temporary email addresses).

If you want, you could try comparing Signal's proprietary network versus the global SMTP network in terms of how secure they are against traffic timing analysis.

[0] https://delta.chat/en/help#how-does-delta-chat-protect-my-me...

If you're building a professional secure environment, forward secrecy is a tradeoff that you need to tune (and OpenPGP gives you the tools for doing so, viz. subkeys and expiration), you absolutely need federation, and identifying contacts by phone numbers (as Signal does) is a zillion times worse than leaking email headers.

It's not possible to make Signal secure, the flaws are on the protocol level. To fix it, you would need to change it until it is no longer Signal.

> it is far better than having email default to clear text

You have your email set up to "default" to PGP? Can you say more about what you mean by that?

Your article seems to focus on individuals. Consider also organisations.

Transport-level security and authentication of email content is a perfectly valid use-case for an organisation when protection against third-party interference is desired. They don't need to worry about forward secrecy, they just need attachments to be transmitted in a legally-compliant manner.

For example, each month HR emails me my payslip as an encrypted attachment. I decrypt it and save it locally. They just have a batch job that encrypts for each user and sends. They don't have to worry about who uses which IM client. They don't need to care if I self-host or use Gmail, because their obligation is simply to keep the information secure in transit.

You are also too keen to support Signal's use of phone numbers as identifiers. That's a design choice, instead of using client-managed identifiers, and makes it unsuitable for organisational use. Whose phone will we use to send the deposition to the court... and who in the court will have a phone with Signal on it? Email by contrast is universal and integrates well into organisational processes without dependency upon individuals.

Autocrypt is the middle ground Thunderbird should have implemented (and which Enigmail used to offer). Email is here to stay, so encryption by default won't happen as long as the PGP standard is used as designed (trust levels and all). Autocrypt improves all that horrible UX, including secure key transfer or rotation, where you can keep doing your own key management if you wish, but you have to do nothing more than enable Autocrypt if you don't. It's a mystery why Mozilla didn't push this, seems a perfect fit for them to empower regular web users.

From what I've read, PGP for Thunderbird is just a first step. My guess is that they chose the easiest/quickest path for first implementation, but I think we'll see more facilitation of encrypted email in TBird in time.

This makes sense to me. I've been trying to get friends, family and colleagues to encrypt email (hell, even signing would be a step!) for about 3 decades, now, and have basically thrown in the towel. So anything that affords a small toe-hold to begin the painful process of building the necessary network effects for the idea of encrypted email to gain traction is a good thing.

The Autocrypt plugin was that toehold for me. Together with a mobile client like K9mail it works beautifully.

I don't know much about Signal, but who manages your private key? Who does the encrypting?

With PGP there is zero chance of anyone decrypting my data unless they have my key and password, which isn't uploaded anywhere and no apps have access to it.

That would make sense for people who consider privacy the single most important measure of a communication system. That is not true for everybody. The advantage of OpenPGP is that it does not break any existing advantages of e-mail (except perhaps simplicity), so it is unequivocally better than unencrypted e-mail, while other protocols may have better encryption/privacy but are worse by other measures.

For me, open-standard-ness and federated-ness are two measures that I consider even more important than cryptographic security. So a communication protocol that does not satisfy either of these has little value to me; I would rather use e-mail.

I find this kind of argument ridiculous. Sure, PGP is not perfect in all cases, but advocating not using it at all is like throwing the baby out with the bathwater.

And personally, I think the points made in the linked article are weak.

Yeah. PGP doesn’t offer forward secrecy. Solution? Use Age!! which also has no forward secrecy!

Apps like ProtonMail or Tutanota may have an impact on encrypted email. If both sides use ProtonMail, communication is end to end secure. That’s also the case with encrypted messaging. In both cases, messages leaving for an incompatible platform may be insecure. At least, an email address is more private than a phone number.

> If both sides use ProtonMail, communication is end to end secure.

If both sides use [the same provider], it's not mail anymore, it's something internal. That is the fundamental issue of ProtonMail/Tutanota/any service provider that pretends to solve end-to-end encryption in email: without standards, it's a proprietary system. Today the only viable path towards easy E2E encryption in email is Autocrypt. AFAIK only Posteo is working towards including it.

Does Signal still require you to use your phone number to use it?

Credit to the people that wrote and maintained bugzilla, both as software and this particular instance. It's still ticking, much longer than (I assume) they planned it to.

My opinion for 20 years has been that Atlassian JIRA would have been stillborn if somebody had added a blue and white CSS theme for Bugzilla.

No no no you don't understand, the end users NEED drag and drop (and loading indicators, and slooow and heavy pages to admire those indicators).

> the end users NEED drag and drop

Ummm... they do.

Never seen a timestamp on the web that said "20 years ago". Wow.

Reminds me of a tiny blog post I put up years ago about the future of archaeology. Forgive the silly site title.


> This hypothetical archaeologist has it easy

I am not so sure of that. Even time stamps cannot be trusted very much.

For example, a lot of photo cameras, audio recorders and other such devices still don’t have a small extra battery for keeping track of time. So every time you run out of battery, or in the case of wall-connected devices you unplug them from the wall, the clock is reset and time is set to some date that the device manufacturer decided to use as the epoch for their device.

And most of these devices don’t have a GPS receiver in them either, so you need to manually set the time. And only some of them will prompt you for it directly while for others you need to remember this and go in the settings menu and adjust the time.

So I routinely get time stamps that are things like 1st of January 2008, or 2015 or what have you. And sure if you notice right away you can correct it. But even when you know you sometimes get tired of that kind of stuff so you just leave it like it is.

And that’s just at the recording stage. Then you transfer files between systems and sometimes you get the original time on your files and sometimes you don’t, and even if you did they might be wrong in the first place as per above.

And all of this is for someone who knows how this stuff works and who tries as good as they can to have proper dates on files and to keep them that way. Then you have people who don’t know and don’t notice.

And then you have people who willfully modify metadata.

And then there is the data itself. We live in a post-truth age they say.

How do you know in a thousand years that the data is from when it says it was from and furthermore that there is even a grain of truth in the data itself?

I am sure there will be competent historians that are able to learn a bit of our time from our data. But I don’t think they will have it easy.

And on top of that I don’t think data can really tell the future what life today is like. It’s hard to put in words what I mean by this last point so I won’t try to say any more about that.

Recently I was delighted to find that Lurker’s Guide to Babylon 5, one of the first websites I remember really immersing myself into, is still online and even updated every now and then! I think the site has been continuously up since 1994, and most of the content is over 20 years old, published when the series was originally run. It’s a real treasure trove for any B5 fan and likely the most comprehensive and well-curated collection of B5 knowledge in the world.

[1] http://www.midwinter.com/lurk/lurker.html

I was born 20 years ago...

It would have been hilarious if you'd fixed this bug... :-)

Now that you mention it, I'm surprised I can't think of any example of someone fixing a software bug that is older than them. (I'm assuming Warner Losh is older than 35).

Man, the last company I worked at had bugs with updates going back over 30 years. Kinda surreal looking at bugs that are older than you are.

Dropbox respects the last-modified date on your files, so it shows things like “modified 29 years ago”, which is funny to see on an iOS app.

Thunderbird has the only calendar I know that has a "multiweek" display as opposed to (well, in addition to) the utterly retarded month view that exists in every other GUI.

We've been doing electronic calendars for how long now? Why are we still using a paradigm from paper based calendars? At the beginning of a month I can see three weeks ahead, but at the end of the month I can see three weeks behind. It frustrates me no end that this is still a thing. It reminds me of the early days of Google maps when they were no better than paper maps, but now we can rotate the map, zoom in and out etc. But calendars are still no better than paper calendars. Apart from the one in Thunderbird.

Multiweek is pretty good. But I wish it had a better "multi day" view. Week view is great, a column per day, with the days neatly split into hour blocks. But my screen is huge. It could fit much more than 7 days with just as much detail. As soon as you switch to Multiweek, all that detail is lost. I guess what I'd like to see is a "Multi day" view, which starts today and displays the next N-days with as much detail as my screen allows (or N configured days).

Sorry for nit-picking and being off-topic (I get your point!), but I don’t think there was ever a version of Google Maps without zoom. Actually, I have the fondest memories of the first version of Google Maps (the orange-themed one). It was so much better than anything it replaced at the time, and I think it would be perfectly usable even today, 15 years later!

It did have zoom, but they were fixed levels so no different to having multiple paper maps at different scales. Yes, of course there is the advantage that it's "not paper", but that was the only advantage really. This is not unexpected at all as new technology very often mimics existing technology in its first iteration. If you look at the first outputs of the Gutenberg press you can see they were trying to mimic handwritten books of the time. But usually the new technology very quickly surpasses the old after the first iteration, as electronic maps have now done.

I do think there ought to be a way to do good cryptography in email. Email is not going away anytime soon, so giving up on it as a legitimate place where cryptography is needed seems too ivory tower for me.

The “dead simple solution” is to just run the Signal protocol over SMTP, although I’m sure it’s possible there is a better design if you were to think about the specifics.

A slightly more realistic "dead simple solution" might be for mail clients to extend their OpenPGP support to include Autocrypt[0] which would allow users to gain some of the advantages of OpenPGP without having to understand any of the details.

[0] https://autocrypt.org/
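For context, Autocrypt works by having clients attach a header carrying the sender's key to every outgoing mail, so keys spread automatically as people correspond. Roughly (per the Autocrypt Level 1 spec; the address is a placeholder and the base64 key material is elided):

```
Autocrypt: addr=alice@example.org; prefer-encrypt=mutual;
 keydata=mQGNBF...   (base64-encoded OpenPGP key material, elided)
```

A receiving client remembers the key for that address and can start encrypting replies without the user ever doing key management.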

Interesting. In my opinion, this should also contain provisions for storing the private key password-encrypted on the IMAP server.

I think that's what the Autocrypt Setup Message is for:


S/MIME is the closest to dead simple solution, but it requires trusting certificate authorities.

It's a much better user experience, and honestly I'm surprised that no enterprise orgs have adopted it, because it would probably be cheaper than all this phishing training.

Enterprises, at least in the form of the U.S. federal government, have, but there are a few key drawbacks:

1. At least until recently, Microsoft implemented it as blocking code in the UI thread – open a message and Outlook won't paint until it can verify the cert, access your local key store (hope your token is in a USB port which is 100% reliable), etc. If you thought “Does that mean that revocation checks block the UI until a network process completes?”, you're sadly right.

2. Adoption hasn't been enough to be able to ban untrusted senders. This could still be quite useful for, say, a hard requirement that *@example.com must have signatures but it doesn't help with really common phishing tactics like pretending to be a vendor, business partner, etc.

3. You definitely need to get serious about key management because your ability to read your old encrypted email depends on retaining those keys. This is obviously not impossible but it has a cost when you think about the need to store them securely in a manner which can be restored after the inevitable system failures and compromises.

> 2. Adoption hasn't been enough to be able to ban untrusted senders. This could still be quite useful for, say, a hard requirement that *@example.com must have signatures but it doesn't help with really common phishing tactics like pretending to be a vendor, business partner, etc.

Preventing forgery of @example.com is nice, but sadly doesn't matter that much, because a message from "Your Boss" <totallyfake@superscammy.example.org> is just going to show up as Your Boss in the UI with most clients these days, and Enterprise-oriented clients are worse than the norm. The fastest way to see the actual address is to just press reply, which is irritating.

Or even “hey, I’m away from the office and lost my company phone. Can you help me …” attempts using an obvious third-party service. All of these have fooled people in the past and you really need high adoption rates to seriously reduce the odds.

The one that almost got me was

      From: "John Q Boss"
      Subject: "the blog is down"

      Can you check <a href="http://0-dayfuntimes.example.org">http://blog.example.com</a>
Of course, that came in on my phone, as I was leaving for the day, and the phone doesn't show the email address or link destination. :(

> blocking code in the UI thread – open a message and Outlook won't paint until it can verify the cert

Yikes. I should probably be surprised, but I guess I'm not. This sort of thing goes a long way towards explaining all sorts of other oddities of using Outlook.

Outlook is _old_ and predates threading being a mature OS primitive. I remember reporting bugs of this class against Office 97.

They seem to be extremely hesitant to change it, which I assume is due to the corporate plugin ecosystem.

S/MIME does not protect from phishing.

Not if you want to interoperate with anything currently in existence. And if you don't, why bother building it on SMTP?

You build it on SMTP because upgrading clients is easier than doing a clean slate redesign of ubiquitous internet protocols.

Presumably we're discussing how an open protocol addition might gain any traction at all over walled garden protocols like Slack -- and in those cases you want to maintain as much compatibility as possible.

"Federated SecureEmail 2.0" would be dead-on-arrival, where "Secure Client on top of bog-standard email" is ever so slightly less DOA. You get to re-use the existing identity/routing system, existing servers, existing authorization, etc.

>You build it on SMTP because upgrading clients is easier than doing a clean slate redesign of ubiquitous internet protocols.

Why not just toss the whole shebang and rebuild it below that layer? Signal Protocol seems to be pretty successful here.

>Presumably we're discussing how an open protocol addition might gain any traction at all over walled garden protocols like Slack

No, I'm asking how a new open protocol can be built on top of email in a way that maintains strong backwards compatibility while offering strong security guarantees, like end-to-end encryption. I don't think it's possible.

> No, I'm asking how a new open protocol can be built on top of email in a way that maintains strong backwards compatibility while offering strong security guarantees, like end-to-end encryption. I don't think it's possible.

That's exactly what Autocrypt is doing. They're still at Level 1, i.e. opportunistic encryption: try to encrypt if possible, default to plain text if not. That's 100% backwards-compatible.

What we all want is a system where we are sure that messages can't be sent in plaintext. The only way this can happen is if MTAs actually read messages and check if encryption is applied; if not, reject it. That's something that can happen but by definition it can't be backwards-compatible.
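An MTA enforcing such a policy would need some way to decide whether an incoming message is actually encrypted. A minimal stdlib-only sketch of such a check (the function name and heuristics are illustrative, not from any standard):

```python
import email
from email import policy

def looks_encrypted(raw: bytes) -> bool:
    """Heuristic an MTA might apply before accepting a message: either
    PGP/MIME (RFC 3156) structure, or an inline-PGP armored body."""
    msg = email.message_from_bytes(raw, policy=policy.default)
    if msg.get_content_type() == "multipart/encrypted":
        return True
    body = msg.get_body(preferencelist=("plain",))
    text = body.get_content() if body is not None else ""
    return "-----BEGIN PGP MESSAGE-----" in text

plain = b"From: a@x\r\nTo: b@y\r\nSubject: hi\r\n\r\nhello\r\n"
print(looks_encrypted(plain))  # False -- an enforcing MTA would reject this
```

A real deployment would reject at SMTP time with a 5xx response; and as noted above, doing so is precisely what breaks backwards compatibility.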

> No, I'm asking how a new open protocol can be built on top of email in a way that maintains strong backwards compatibility while offering strong security guarantees, like end-to-end encryption. I don't think it's possible.

You can't have strong security guarantees with backwards compatibility, because backwards compatibility requires plain-text sending.

Unless you're OK with limited guarantees like the UI will tell you when the mail could be sent in plain text. Or if the UI tells you the message will be end to end encrypted, it won't fallback to plain text, etc.

>You can't have strong security guarantees with backwards compatibility, because backwards compatibility requires plain-text sending.

That's right.

>Unless you're OK with limited guarantees like the UI will tell you when the mail could be sent in plain text. Or if the UI tells you the message will be end to end encrypted, it won't fallback to plain text, etc.

This would require changing all the clients.

> This would require changing all the clients.

Not really, the old clients would never send (or receive) end-to-end encrypted messages, so they don't need new UI to tell you that. That's the cost of unbounded backwards compatibility.

If you want to have 100% coverage of email users, changing all clients is part of the deal. But, uhhh, good luck with that. Server-based standards are an easier lift --- you could build a standard to require hop-to-hop encryption on mail and expect that to plausibly gain enough adoption to use over a manageable time frame. Of course it would be trivial for a hop in the delivery path to subvert that and remove the request, and all hops in delivery would still have message content access; but if people like it, a large majority of servers might support it in 5-10 years.

>Of course it would be trivial for a hop in the delivery path to subvert that and remove the request, and all hops in delivery would still have message content access; but if people like it, a large majority of servers might support it in 5-10 years.

You're describing TLS, and this is already happening.

Not just TLS, but a way to say, while composing a message, that if it cannot be delivered via TLS with a valid certificate, then don't deliver it. Which apparently already exists as REQUIRETLS https://www.rfc-editor.org/rfc/rfc8689.html

Published November 2019. I didn't see any data on support, and hadn't heard of it before looking just now.
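For reference, RFC 8689 works roughly like this: the server advertises the extension in its EHLO response, and the client tags the message with a REQUIRETLS parameter on MAIL FROM. An abridged transcript (host names and addresses are placeholders):

```
C: EHLO mail.example.org
S: 250-mail.example.net Hello
S: 250-STARTTLS
S: 250 REQUIRETLS
C: STARTTLS
   ... TLS handshake; SMTP session restarted over TLS ...
C: MAIL FROM:<alice@example.org> REQUIRETLS
S: 250 OK
```

Any relay that can't honor the tag (no TLS, or an invalid certificate on the next hop) is required to bounce the message rather than deliver it in the clear.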

Because there are too many users on SMTP/Email, but not many PGP users to worry about interoperating? I mean, Hey.com shows that people still like SMTP/Email.

Does it? Or does it show that a small number of highly technical users are willing to pay more for a different email experience? It's a bit early to judge their impact on the ecosystem, and I'm skeptical they'll capture a significant user base in the way that Gmail did.

People use email because everyone uses email; you're not going to get people to change how they use email entirely. Nobody cares that it's running on SMTP, they care that they can email everyone they know with an email address.

Aren't there specific legal protections for email in some places, distinct from other online communication?

I wish to try and help with the dead simple solution. Heck, as one of my hobby projects I wrote myself a wiki doc (posted here unedited, so pardon me if it's not perfect: http://txti.es/ax4tq) for a simple, simple, simple app that would do basically Signal over an SMTP-like system, if not SMTP itself for v1 bootstrapping when online, and do simple mesh relaying (OK, that's probably an oxymoron) up to 7 hops when offline.

I don't feel confident enough to actually try this project yet, but at least I wrote what my ideal chat app would be, oh and came up with the fancy name 'chattaur' for it.

Isn't the "dead simple solution":

1. Write the message.

2. Encrypt the message.

3. Paste the encrypted message into the email client.

Your odds of accidentally sending a message in the clear are zero.

And now to read and reply to it, someone has to copy it from the email client, decrypt it, edit their replies inline, re-encrypt it, and paste it in.

As the number of people on the email chain approaches 2, the chance of someone accidentally copying and pasting the entire decrypted email chain into a reply reaches shockingly high levels.

A painful manual process, where a simpler and slightly less painful path works but results in insecurity, is not a good way to make sure security happens.

> As the number of people on the email chain approaches 2, the chance of someone accidentally copying and pasting the entire decrypted email chain into a reply reaches shockingly high levels.

Heck, people (myself included) regularly mess up Reply and Reply All :-)

Have you ever attempted to get an organization of people to do this reliably?

You could run any encrypted protocol on top of SMTP/IMAP; been there, done that. The main issue is that you need specialized software on both ends to make it work.

Still, doing so solves the transport and persistence problems you would otherwise have to deal with.
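As a rough illustration of the "encrypted protocol over SMTP" idea, here is a stdlib-only sketch that wraps a ciphertext in an ordinary MIME message. The XOR keystream below is a placeholder standing in for a real protocol like Signal and is NOT secure; the `X-Toy-Encrypted` header and addresses are made up, and the actual `smtplib` send is commented out so the sketch runs offline:

```python
import base64
import hashlib
from email.message import EmailMessage

def keystream_xor(key: bytes, data: bytes) -> bytes:
    # Placeholder cipher: SHA-256 in counter mode as a keystream.
    # NOT secure messaging -- a stand-in for a real ratcheting protocol.
    out = bytearray()
    counter = 0
    while len(out) < len(data):
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return bytes(x ^ y for x, y in zip(data, out))

def build_encrypted_mail(key: bytes, sender: str, rcpt: str, body: str) -> EmailMessage:
    msg = EmailMessage()
    msg["From"], msg["To"] = sender, rcpt
    msg["Subject"] = "encrypted message"   # note: metadata stays in the clear
    msg["X-Toy-Encrypted"] = "v1"          # hypothetical marker for the receiving client
    ciphertext = keystream_xor(key, body.encode())
    msg.set_content(base64.b64encode(ciphertext).decode())
    return msg

def read_encrypted_mail(key: bytes, msg: EmailMessage) -> str:
    ciphertext = base64.b64decode(msg.get_content())
    return keystream_xor(key, ciphertext).decode()

key = b"pre-shared key -- key exchange is the hard part"
mail = build_encrypted_mail(key, "a@example.com", "b@example.com", "meet at noon")
# smtplib.SMTP("smtp.example.com").send_message(mail)  # transport is plain SMTP
print(read_encrypted_mail(key, mail))
```

Any server in the delivery path sees only the base64 blob, which is exactly why both ends need the specialized software: a stock client renders gibberish.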

> The “dead simple solution” is to just run the Signal protocol over SMTP

I'm pretty sure the dead simple solution is to either use SMTP or Signal and not a frankenstein's monster of both.

How do you do key exchange?

Video chat might be good for this?

With the rate that video-based deepfake projects are progressing, video/image-based verification likely won't remain useful for much longer.

I'm more impressed that a ticket managed to live on in a tracker for so long without getting lost over the years.

In contrast to ansible issues, which would have changed tracker or repo ~30 times in that time.

Or any other recent repo where issues get "stale" and closed automatically.

Also: No, no, you're wrong. Ansible is not crashing, you're just holding it wrong.


Previously on HN:

36 comments, 1 day ago: https://news.ycombinator.com/item?id=24501872

226 comments, 2 months ago: https://news.ycombinator.com/item?id=23864934

51 comments, 12 months ago: https://news.ycombinator.com/item?id=21197327

"How much do you trust the owner of this key to sign other keys properly?

* I don't know

* I do NOT trust

* I trust marginally

* I trust fully

* I trust ultimately"

This is a real pop-up I got the last time I tried to use PGP with Thunderbird.

If people still get regular pop-ups like these, I don't think PGP will ever be popular.

They might have switched to p≡p (Pretty Easy privacy), which uses TOFU (trust on first use), so maybe this is a thing of the past, but I don't know.

Security has a big dirty secret: it's hard. If what you're doing is easy, it ain't security. Simple as that.

TOFU works if you actually do it properly. That means verifying the fingerprint of the key you just received out of band, one time. Now you can trust that key. How many people do this? Even IT professionals routinely just say yes to new ssh keys without checking. That's not TOFU, that's just trusting that you are not under attack, which is not security.
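The out-of-band check itself is mechanically simple: an OpenSSH SHA256 fingerprint, for instance, is just the unpadded base64 of the SHA-256 digest of the public key's wire-format blob, which you can recompute yourself and compare against what the server operator reads to you over the phone. A stdlib-only sketch (the key bytes below are made up for illustration; a real blob comes from base64-decoding the second field of a `known_hosts` or `id_*.pub` line):

```python
import base64
import hashlib

def ssh_fingerprint(key_blob: bytes) -> str:
    # OpenSSH prints SHA256 fingerprints as the unpadded base64 of the
    # SHA-256 digest of the wire-format public key blob.
    digest = hashlib.sha256(key_blob).digest()
    return "SHA256:" + base64.b64encode(digest).decode().rstrip("=")

# Dummy blob for illustration only (a real ed25519 blob also carries the key material):
blob = b"\x00\x00\x00\x0bssh-ed25519" + b"\x00" * 36
print(ssh_fingerprint(blob))
```

Comparing that string over an authenticated channel, once, is what turns "first use" into actual trust.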

TOFU is actually just a degenerate form of the web of trust, where you automatically answer "I do not trust" to the above, so each new trusted key has to be established individually. But it does make things easier. Easier to not be doing security at all.

When I observed people actually using Enigmail, they just did "search for that person's keys" in Enigmail, which searched the MIT keyserver or whatever, and picked the first (and only) result.

(Again, that was before p≡p/TOFU. At my former company, we had a PGP company mandate. Everyone hated it, and it never ever worked properly for group e-mails, no matter how hard we tried to hack it. Not sure what the experience is now.)

I know it's virtually forbidden to discuss practical details of PGP on HN, but what do you think is the sticking point about that pop-up? It makes people think a little rather than trust blindly, which seems to be...good?

I do agree that the marginally and fully options are useless. Wonder if adoption would be better if you just removed them, and were left with:

- I don't know what PGP is (exit, opens documentation or a tutorial)

- I definitely distrust this key (because it has been marked as malicious in the wild)

- I trust this key (because I have verified it either in person, or through a proof like Keybase)

Even you misunderstood that question.

It's not "how do I trust this person's key", but "how much do I trust this person to sign other people's keys".

As in, "how careful do I think that person is about checking other people's keys". You are not judging their key, or your own ability to check keys. You are judging the other person's ability to check a third person's keys.

This is important for PGP "web of trust".

21 year old bug, and Damon Gallaty apparently still works at the company that he worked at back then, albeit with some changes to the name on the checks he gets.

I have some serious feels for the guy, that had to have been a frustrating experience.

Wonderful. What an amazing amount of work to implement a terrible idea.

See https://latacora.micro.blog/2019/07/16/the-pgp-problem.html for why we shouldn't be using PGP in 2020.

Gee, this anti-PGP rant showed up 3 times so far in this thread. I think that justifies a link to my critique:

* https://articles.59.ca/doku.php?id=pgpfan:tpp

Honestly, I have no idea who you are. I'm not surprised that someone out there thinks that it is a good idea.

However every cryptographer that I know to trust who has bothered to comment on PGP has said to not use it. For example see https://www.schneier.com/tag/pgp/, or https://secushare.org/PGP.

Therefore I won't be using PGP. No matter how much you like it.

Good for them, but PGP is essentially dead for anyone but technical zealots. S/MIME is better supported and easier to use.

That's actually the deal with Thunderbird, too: it supported S/MIME natively and needed a plugin, Enigmail, to interface with OpenPGP. What is happening here is that the Thunderbird add-on interface is changing, and they decided that instead of porting Enigmail they would integrate OpenPGP straight into Thunderbird, like S/MIME.

I am so annoyed by anyone saying something is "dead".

I tried to use S/MIME. I really did. Went to those sleazy web pages of companies that issue certificates, was told I'd need to pay for various certificate products. Tried a free option. Tried to load the cert onto my YubiKey, tried using it in Apple Mail…

The conclusion: it is by no means easy or well supported, and requires you to deal with those sleazy cert-selling companies.

Perhaps true for individual mail but some corporate mail is very active with it.

It works quite well within groups. If you use it every day it's a total non-issue. The problem is people who set it up, forget about it, and finally get an encrypted message two years later. Of course it is then confusing to try to remember how to use it.

Gave me memories of the times when the US had cryptography export restrictions :)

Sigh. Thunderbird has so many great features under the hood, but the hood is the biggest issue that prevents me from switching full time. The Classic and Wide layouts are not optimized for the wide resolutions of modern laptops, and their Vertical layout is unusable. The columns in the list of messages are horizontal, taking up too much width. Each layout is a tradeoff between whether I want the message pane to be usable or the list of messages to be usable.

I was never satisfied with their layout either. Always liked the way it behaved though. Ended up switching to Postbox, a Thunderbird fork, about 7 years ago and haven't really looked back. You have to pay for it, but it's a pretty solid email experience for what you pay (I think I paid $40 for a lifetime license a few years ago).

Have you tried turning off some of the columns in the message list and using Vertical view? There's an absurd number of them that you really don't need all the time.

I have, but the biggest horizontal space eaters in that view are the most important ones: subject, from, and date. Turning off the others doesn't really help that much. Besides, Mail.app gives me all the same information as Thunderbird, but the list of messages is rearranged for that view; it doesn't eat into the message pane. Thunderbird's vertical layout is just a rearrangement of the panes.

Edit: I mixed Vertical and Wide in my original message. I meant the vertical view is unusable. I've edited it.

Each to their own I guess. I use vertical and like it. Left pane on the left is all my accounts and folder tree; middle pane is email star:subject:sender:date; right pane is email contents.

Above the three panes I simply have the message filter box and above that is the main toolbar (get messages, write, address book, view drop down, quick filter button).

Simple but works well for me on 1920x1080 or higher (and dealing with about a hundred or so emails a day).

Is anyone else having problems with this, after migrating from Enigmail on Thunderbird 68?

It fails to associate my accounts with the keys in my keyring, so I tried to import an exported key. Whenever I do this, it gets stuck in a loop asking me for the passphrase for an old, revoked key :( Even after I deleted the revoked keypair -- which I know I shouldn't have -- it's refusing to cooperate.

So it looks like I've actually lost this feature by upgrading...

Where's the actual diff? It would be interesting to see how much and what sort of code is actually involved.

Nice to see Thunderbird still improving.

It also reminds me of the infamously long delay on JIRA-1369, the 20 year old ticket.


At least it's neat that software can be used for 20 years.

As someone who deals with the opsec/public interface (and is a cynic about technology in general), I have to say that encrypted email via PGP has to be the computer security nerd's biggest and longest running "emperor has no clothes".

Remember when someone had to set up a GoFundMe for the one guy supporting OpenPGP, and everyone thought "let's help this guy out" but at the same time thought, "Why, though?"

Reminds me of this article on HackerNews a little while ago:


Wow, this could have made such a difference to the direction of the internet, to government surveillance, and maybe even saved the lives of dissidents, if it had been done 20 years ago.

Has someone made a ranking of the oldest bugs Mozilla has? Maybe even by severity?

I remember some years ago a security ticket was resolved after someone on reddit celebrated its decennial.

Is this the longest time between request/bug and fix?

It's in a different field, and I'm being very tongue-in-cheek - but the Twenty-Seventh Amendment to the US Constitution has it beat.


Cool link

Last month there was a patch for a 35-year old bug in patch: http://bsdimp.blogspot.com/2020/08/a-35-year-old-bug-in-patc...

If it does have that title, there are other bugs that are almost as old in Thunderbird and still open, so it might not hold the record for long if someone gets around to fixing any of those.

For example, bug 41929 [1] is only 6 months younger. Briefly, you cannot have two different IMAP accounts in Thunderbird that have the same username and host but different ports.

[1] https://bugzilla.mozilla.org/show_bug.cgi?id=41929

I think Open Office and Libre Office have some contenders back from Star Office days. I still use LO daily though.

Most likely there is a bug filed somewhere that asks for removing most of the parentheses from LISP. :)

In a way, that's how we got Dylan.


16 years and still pending! All bets are off for this dark horse

It has been 21 years since people asked to have SRV DNS resource record support in Mozilla.

* http://jdebp.uk./FGA/dns-srv-record-use-by-clients.html

I’m thinking that they might implement HTTPS and SVCB records, once standardized:


Now, my opinion may seem controversial, but I personally believe that having feature requests old enough to legally drink is not a sign of healthy project management.

The bug is closed. What more do you expect project management to do?
