Lavabit code open sourced (github.com)
458 points by jayfk on Apr 30, 2016 | 60 comments

I never read the closing letter and it is quite unnerving.

"If my experience serves any purpose, it is to illustrate what most already know: our courts must not be allowed to consider matters of great importance in secret, lest we find ourselves summarily deprived of meaningful due process. If we allow our government to continue operating in secret, it is only a matter of time before you or a loved one find yourself in a position like I was – standing in a secret courtroom, alone, and without any of the unalienable rights that are supposed to protect us from an abuse of the state’s authority."

What is the closing letter? Can you give a link?

Edit: Found the source. Link is: http://lavabit.com/

I was also looking for it, thx

A number of comments in this thread appear to suggest that Lavabit was end-to-end encrypted. It was not.


What does end-to-end encrypted mean?

It means that if Alice sends a message to Bob, that message is encrypted with a key that only Bob (or Alice) knows. No server in between them can read the message, so the question of whether they trust the server doesn't matter very much.
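As a toy illustration of that point (the XOR "cipher" below is not real cryptography, it only makes the structure visible): the relay server only ever handles ciphertext, so whether you trust it barely matters.

```python
import hashlib

def keystream(key: bytes, length: int) -> bytes:
    # Toy keystream: repeatedly hash the key with a counter. NOT real crypto --
    # this only illustrates that the relay never needs the shared key.
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def encrypt(key: bytes, plaintext: bytes) -> bytes:
    return bytes(a ^ b for a, b in zip(plaintext, keystream(key, len(plaintext))))

decrypt = encrypt  # XOR is its own inverse

# Alice and Bob share a key; the server in the middle relays only ciphertext.
shared_key = b"alice-and-bob-only"
ciphertext = encrypt(shared_key, b"meet at noon")  # what the server sees
assert ciphertext != b"meet at noon"               # server can't read it
assert decrypt(shared_key, ciphertext) == b"meet at noon"
```

In a real system the shared key would come from public-key exchange rather than being pre-shared, but the server-sees-only-ciphertext property is the same.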

Lavabit wasn't end-to-end encrypted. DarkMail will be (assuming it ever materializes).

Dark Mail is not end-to-end either. Last I looked, it had varying levels of security. The most practical one, from what I saw, is analogous to current SMTP+TLS. It gets the keys from the "organization" (i.e. Gmail or your mail server). Given that it is incompatible with SMTP, I don't see the point. Use Gmail, add PGP if you need it.

It's not the original Lavabit, but dark mail.

I'm pretty sure magma is basically the original Lavabit code.

Still will lead to lots of development in the area. I didn't even think of working with something like this due to the learning curve, but this might help kickstart that.

Where's the original Lavabit?

I would imagine a stipulation of the original subpoena would involve not sharing the source of Lavabit. It sounds as though what Ladar has done is re-engineer Lavabit in the form of Dark Mail to bypass any gag orders.

I doubt it. LE should love it when a fundamentally insecure design like Lavabit gets picked up.

When I looked at the Dark Mail draft it was incompatible with regular email. And the trust models it has were basically the same as you'd get with Gmail today. End to end remained difficult (of course).

Plus it has weird stuff. Like a field for political party on all contacts or something.

Curious: what would happen if a bunch of these popped up all over the place and used end to end encryption between each other making email truly secure between each other? Would such a thing be possible? Adopt Mega's model where they store the private key, but encrypt it with the user's password and only the browser has the decrypted copy.
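That Mega-style model can be sketched roughly as follows. PBKDF2 is the real primitive here; the XOR "wrap" is a toy stand-in for a proper authenticated key-wrap, and all the names and values are made up for illustration.

```python
import hashlib, os

def wrap_key(private_key: bytes, password: str, salt: bytes) -> bytes:
    # Derive a key-encryption-key from the password (PBKDF2 is real; the XOR
    # below is a toy stand-in for a proper AES key-wrap).
    kek = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 200_000,
                              dklen=len(private_key))
    return bytes(a ^ b for a, b in zip(private_key, kek))

unwrap_key = wrap_key  # XOR is symmetric

salt = os.urandom(16)
private_key = os.urandom(32)          # the user's long-term key
stored_blob = wrap_key(private_key, "correct horse battery", salt)

# The server stores (salt, stored_blob) and cannot recover private_key
# without the password; the browser unwraps it locally after login.
assert unwrap_key(stored_blob, "correct horse battery", salt) == private_key
assert unwrap_key(stored_blob, "wrong password", salt) != private_key
```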

> Adopt Mega's model where they store the private key, but encrypt it with the user's password and only the browser has the decrypted copy.

Unless you're going to audit every single line of code Mega uses on their site every time you use it, that would leave you completely vulnerable to any backdoor included in the code (because of a court order or a $5 wrench).

Secure E2E Web Crypto is a myth.

"Secure" anything is a myth, it's about what levels of risk you are willing to accept.

The bottom line is that yes, this sort of setup would be worse than PGP email, but it would still be better than traditional web mail which the vast majority of people use.

On an interesting aside: You don't have to audit the code every single time. On first execution of the code, it can store itself in the browser's application cache indefinitely, and manage upgrades in the same style as traditional software. This is fairly new ground though :)

> store itself in the browser's application cache

That may solve some of the distribution problems, but no browser-based software can ever be truly secure for a different reason: you have to run the crypto in the same process as the network and parsing code. All browsers have a history of security issues and other bugs in these areas. We should be minimizing attack surface, but browser designers instead decided to add more features that inevitably lead to more bugs.

The browser is an incredible "weird machine"[1], and relying on them for security requires believing that nobody will figure out how to program that "weird machine". The solution is something in the style of the "agent" programs[2] for ssh/gpg. The crypto - especially the private keys - must be done in an isolated process, so buffer-overflows and parser bugs at worst leak only the current data. If the crypto is handled in the browser itself, there is always risk that a bug will allow the keys to be leaked.

[1] https://media.ccc.de/v/28c3-4763-en-the_science_of_insecurit...

[2] ssh-agent(1), gpg-agent(1)

True, the browser is looking more and more like a regular OS which executes untrusted programs. As far as security is concerned though, I am hoping Servo will solve most of the issues.

I'd argue it's not significantly better than any existing product that uses transport-level security without E2E crypto. Both systems are likely to be broken if there's a server-side breach, though I'd admit there's a bit more effort involved for an attacker (planting a backdoor and collecting keys as opposed to just dumping the whole data set).

Definitely. I think that's why companies that do this (like SpiderOak and Keybase) implement client-side web crypto for convenience use but provide a pretty visible warning about the potential pitfalls compared to the desktop app, so users can make an informed choice.

A lot of people are researching solutions like the app-cache approach to try and fix this; fortunately it's getting better!

>this sort of setup would be worse than PGP email

Why not use PGP then? It's not hard.

I've read this several times, and while some of the points are correct it still feels like a marketing piece from Moxie.

gpg isn't difficult to use, hell my grandmother could do it. Concepts like private keys (this is the key you keep private) and public keys (this is the... well, public key) are trivial to understand.

You don't need to understand the technical details of encrypting and signing messages either, you just paste in the message and the public key and encrypt.

It's by far the best email encryption solution we have right now, and I seriously doubt we'll be seeing anything better any time soon.

> gpg isn't difficult to use

Yet any research we have shows that people, even technical people, make simple but critical errors.

How would you stop a script crafted to overwrite the stored code without letting you know? You'd still have to audit every time to know that something like this wasn't introduced.

Serious request, what do you think about unhosted client side crypto?


I think something like the self-hosted version (file:///) of this is getting close to something I'd consider safe enough. The backdoor problem from my previous post obviously isn't exclusive to web apps - a large chunk of software we use day to day performs updates automatically, and no one's going to reverse-engineer everything after each update.

That being said, the average native app implementing E2E crypto is probably still safer as of today for a number of reasons:

Binaries are typically signed by the developer, which isn't something you usually see in a web app using Web Crypto. Signing keys are easier to protect from unauthorized access, compared to your entire web stack.

With native apps, you tend to have far more control over code changes. You can choose your own update schedule and review new software versions for any critical component. A typical web app could change the code on every request. Web apps also make it a lot easier to pull off a targeted attack.

There's also simply a larger attack surface in general. Anything from XSS to phishing is much easier to pull off in a web app (though, as iMessage showed us, it's not quite that simple). Just think about how much effort browser plugins for password managers have to go through to prevent users from accidentally entering their master passphrase in some phishing form.

@chrisfosterelli mentioned the app cache approach of circumventing this - I guess we'll end up with some kind of trusted zone for crypto-related code in browsers, maybe with code signing and what not, sooner or later, and that might get us fairly close to the security of native apps. I don't think we're quite there yet.

PS: Thanks for writing acme-tiny!

Thanks for bringing the unhosted trend to my attention. I hadn't heard of it. I'm a bit skeptical about the graphic on the front page, given that applications often need to perform computations on the data itself. There's probably something on the site that addresses that. I'll read it in a few days and then maybe look at your tool.

Btw, you need to correct the WebCryptoAPI link in "Security and Philosophy" part of this page:


Clicking it gives me nothing. Copy-and-pasting the link results in something that resembles spyware. I think it's just missing a colon after http, is all.

I've actually looked at the code Mega uses (mainly for downloading without a browser), and the key is not stored on the server but is actually part of the URL fragment, which is never sent to Mega, and decryption happens on the client:



I haven't looked at the upload side but that's an interesting datapoint. Not sure how secure AES128-CTR is.

That would avoid transmitting the key to the server, but in the scenario I mentioned, it would still be accessible to the backdoor via window.location.hash.

I meant the key to decrypt the user's own folder. That has to be stored on the server so you can access your files from anywhere. But yes, share files/folders keep the key in the window hash, but that's not what I was talking about.
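For the share-link case, the reason the server never sees the key is a property of URLs themselves: the fragment after `#` is not part of the HTTP request. A small Python sketch (the `mega.example` link is made up for illustration):

```python
from urllib.parse import urlsplit

# Hypothetical Mega-style share link: the part after '#' carries the key.
link = "https://mega.example/file/abc123#file-decryption-key"
parts = urlsplit(link)

# Only scheme/host/path are used to build the HTTP request; the fragment
# stays on the client, so the server never sees the key.
request_target = parts.path
assert parts.fragment == "file-decryption-key"
assert "#" not in request_target and "key" not in request_target
```

Of course, as noted above, any script running on the page can still read the fragment via `window.location.hash`, so this protects against the server logging the key, not against a backdoored client.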

That isn't entirely true. You could easily develop a web extension that validated the hash of the scripts being sent down the wire. Then you'd have an equivalent security guarantee to a normal desktop client.
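A minimal sketch of that idea in Python, assuming the extension ships with a pinned digest of the audited script (this is essentially what Subresource Integrity does for `<script>` tags; the script contents here are made up):

```python
import hashlib

# Pinned digest the extension ships with, computed from the audited script.
AUDITED_SCRIPT = b"console.log('crypto app v1.0');"
PINNED_SHA256 = hashlib.sha256(AUDITED_SCRIPT).hexdigest()

def verify_script(body: bytes, pinned_hex: str) -> bool:
    # Recompute the digest of whatever the server actually sent and compare
    # against the pin before letting the page execute it.
    return hashlib.sha256(body).hexdigest() == pinned_hex

assert verify_script(AUDITED_SCRIPT, PINNED_SHA256)                    # untampered
assert not verify_script(b"console.log('backdoored');", PINNED_SHA256) # modified
```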

Or you just inject your own payload and pad it until it has a hash collision with the original. At 128M hashes per second you can probably find a payload that works within a day. Since the hashes are published, usually, you know your target, so it's just a matter of time.

You do know how hard it would be to find a SHA-256 collision? We haven't even found /one/. Sure, if you used MD5, you could expect a collision, but SHA256?
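The arithmetic backs this up. Even granting the attacker a birthday-style collision (~2^128 expected work, which is far easier than the second preimage that padding a specific malicious payload would actually require, ~2^256), the 128M hashes/second quoted above doesn't come close:

```python
# Expected work for a SHA-256 birthday collision: ~2**128 hash evaluations.
attempts = 2 ** 128
rate = 128_000_000                    # 128M hashes/second, as claimed above
seconds_per_year = 60 * 60 * 24 * 365

years = attempts / rate / seconds_per_year
assert years > 8e22                   # tens of sextillions of years
```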

I've seen them. Just recently I had a short burst of collisions. It happens. Don't underestimate the resolve of someone who wants to grind an axe.

Lol, no. You have not seen a SHA256 collision. One has never been found, by anyone, ever.

Thanks for calling me a liar. I see the quality of people on this thread has suddenly dropped. So, stick with respectful discourse or keep it to yourself. You are clearly not in the industry, just another useless troll.

I think the best solution is PGP, it's more secure and is more mature - the issue is that it's a chicken and egg problem. There is no point me sending a PGP encrypted email unless I know the person can or is willing to decrypt it. It's also not very usable for non-technical people (i.e. the people who can barely tell the difference between the Googles and the Facebooks).

It would be great if a popular mail client came out that supported PGP as a first class feature [0], initially just signing messages to get awareness and support out, and then end-to-end encryption. I suspect that'll just lead to governments putting even more pressure on banning encryption though, e.g. in the UK where you can be jailed even if you legitimately forget or don't know the decryption key.

[0] Google released a plugin for Gmail a couple of years ago, but I mean supporting and enabling it by default. There's also the issue of whether a web application is really secure, so an open source desktop/mobile application is a must.

Don't fall into a trap with trusting PGP. The advent of high speed factoring from quantum computers (U. Toronto) means government agencies can get your key within a tractable time complexity. One algorithm is not enough to make you secure. The system of crypto algorithms with key management is the only system you should trust.

The Lavabit case would happen.

I've seen some opensource implementations of Protonmail's stuff, any comparisons?

So what happens if you run this on Amazon (or any other cloud provider that would cooperate with govt intrusion)? Do you need your own servers to make it work as intended?

Fun. Don't believe that an open source version of Magma makes it more secure. It just means the version you see looks like the one used on a server. What is actually on the server may not be the version you see in the public repository. Thanks for sharing Magma!

Can anyone speak to the value of the DIME spec?


Relies on DNSSEC or a CA to validate keys (p. 22). Not a good idea if your threat model is governments.

Defines a fairly arbitrary list of non-extensible metadata fields, including gender, alma mater & political party (p. 60). This should be extensible; predefining a lot of that is short-sighted.

I have no clue how this works but could people not just host their server outside the jurisdiction of these assholes?

No. Jurisdiction is irrelevant. The transmission has to pass through servers and wires in the local jurisdiction and thus falls prey to those laws. Hosting in Aruba for your US customers does not provide any veil of safety from prosecution.

The repo wasn't public before?

So Ladar Levison closed his company because he refused to provide a backdoor to his customers' email encryption?

This seems similar to the Apple case, was Tim Cook just too big to bully ?

Snowden had his own encryption or used GPG, so a Lavabit backdoor encryption key would not lower the entropy of Snowden's encrypted emails.

Was Levison's gag order lifted? Why did Lavabit have to close but Apple didn't?

[EDIT] Levison was not jailed.

Ladar Levison was never arrested or jailed, and he chose (to his credit) to shut down his own company instead of complying with a court order.

He complied with previous warrants, right? And since he didn't use PFS, when he finally did hand over his key, it put all previous requests at risk. The problem was representing more security than he actually had.

The security provided was fully within the threat model presented, which never considered this type of attack.

He did comply, for the record. He just sent his key on printed paper instead of a USB stick.

The key was printed using a font that was too small to be readable by any means.

I thought it was readable, just very unwieldy.

You can't really jail Tim Cook if all the engineers quit because they refused to write a backdoor

Would they?

Big difference between 'might' and 'would' there. My point is that if the company wanted it, it would not matter if some engineers had a problem. Fortunately the company stood up since this case was never truly about this one particular phone.
