Web Encryption Gets Stronger and More Widespread: 2014 in Review (eff.org)
76 points by erkose on Dec 25, 2014 | 10 comments



HTTPS Everywhere forces large sites to use content delivery networks which act as a man-in-the-middle for so-called "secure" connections. Cloudflare is doing this for at least 36,000 domains. We know Cloudflare is an interception point. (http://www.washingtonpost.com/blogs/the-switch/wp/2013/09/12...).

What the EFF isn't pushing is MITM detection. There are ways to do that, but the EFF is doing nothing in that area. This is the TSA approach to security - lots of visible activity, with big holes. I call the EFF initiative Security Theater Everywhere.

Remember, the EFF brought us TRUSTe, which turned into a scam so bad the Federal Trade Commission fined TRUSTe. (http://www.ftc.gov/news-events/press-releases/2014/11/truste...)


I don't understand this comment. HTTPS Everywhere forces large sites to use MITMs? No, it forces them to implement good security, which MITMs are incompatible with. Companies just go "yeah we know that we need to be secure, but we can't be bothered doing that because we like our CDN too much, so we'll just put everything at risk with horrible hacks".


"I don't understand this comment."

A content-delivery-network using HTTPS is a MITM. Decryption occurs at the CDN, and everything is in the clear inside the CDN's operation. That's a good place for interception, lawful and otherwise, because so much can be captured there.

It's a scaling problem. If you only use HTTPS for crucial items such as logins and credit cards, the load is small enough that you don't need a content delivery network for those pages. A small number of high-security machines can handle the important stuff. The non-secure pages can go through a CDN, which can cache them.

With HTTPS Everywhere, if you're big enough to need a CDN, and use the same CDN for both secure and non-secure pages, you've exposed everyone's credentials inside the CDN. How much do you trust your CDN? Are you feeling lucky?

What we really need is less SSL and more page signing. There's a W3C proposal for attaching the secure hash of a page to its URL, and validating that in the browser. That guarantees the page you asked for is the one you get, while allowing caching. The caching operation can't change anything without the browser rejecting it. This catches tampering at the router level, the cache level, and the ISP (we're looking at you, Comcast) level. So you can have the performance gains of distributed caching without the risk of tampering.
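The verification step described above can be sketched in a few lines. This is a hypothetical illustration, not code from the W3C proposal; the function name and the bare hex-digest format are my own simplifications:

```python
import hashlib

def verify_page(body: bytes, expected_sha256_hex: str) -> bool:
    """Check a fetched page against the hash its (hypothetical) signed
    URL carries. Any router, cache, or ISP that alters the body will
    produce a different digest, so the browser can reject the page."""
    return hashlib.sha256(body).hexdigest() == expected_sha256_hex

page = b"<html><body>hello</body></html>"
digest = hashlib.sha256(page).hexdigest()

assert verify_page(page, digest)                       # untampered page passes
assert not verify_page(page + b"<!-- ad -->", digest)  # any modification fails
```

The point is that the hash travels with the request, so the intermediary never needs to be trusted: it can serve the bytes, but it can't change them undetected.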


>What we really need is less SSL and more page signing.

This may more realistically address MITM tampering attacks, but doesn't address the second concern the EFF and others are trying to mitigate, namely passive MITM eavesdropping[1].

CDNs are definitely a prime interception point if HTTPS can't be scaled more securely. To address the (hopefully short-term) scaling issues, I wonder if users will be asked to pick between a fully encrypted but slightly slower experience (anti-tamper + anti-eavesdrop), or a 'mostly secure' (anti-tamper only) and faster experience.

[1] https://www.eff.org/https-everywhere/faq


> There's a W3C proposal for attaching the secure hash of a page to its URL

My google-fu seems weak, any chance you could link it? It sounds like a great idea in principle (although you'll need buy-in from browser AND server vendors).


http://www.w3.org/TR/SRI/

It's called "subresource integrity". Mozilla and Google have reps working on the spec. Example:

    <script
        src="http://code.jquery.com/jquery-1.10.2.min.js"
        integrity="ni:///sha-256;C6CB9UYIS9UJeqinPHWTHVqh_E1uhG5Twh-Y5qFQmYg?ct=application/javascript">
    </script>
Anybody in the transmission path can cache that, but they can't change it. This improves caching performance in general, because you don't need timed expiration. Load jQuery 1.10.2 once, and never load it again unless it changes.
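The "no timed expiration" point can be modeled as a content-addressed cache. This is a toy sketch of the idea, assuming entries keyed by their own digest; the class and method names are illustrative, not part of the SRI spec:

```python
import hashlib
from typing import Optional

class ContentAddressedCache:
    """Toy model of the caching win described above: entries are keyed
    by their own SHA-256 digest, so they never need timed expiration.
    A new version of a resource simply gets a new key."""

    def __init__(self) -> None:
        self._store: dict[str, bytes] = {}

    def put(self, body: bytes) -> str:
        key = hashlib.sha256(body).hexdigest()
        self._store[key] = body
        return key

    def get(self, key: str) -> Optional[bytes]:
        body = self._store.get(key)
        # Re-verify on read: a swapped or corrupted entry is rejected,
        # which is what makes an untrusted intermediary safe to cache in.
        if body is not None and hashlib.sha256(body).hexdigest() != key:
            return None
        return body

cache = ContentAddressedCache()
jquery = b"/* jquery-1.10.2.min.js contents */"
key = cache.put(jquery)

assert cache.get(key) == jquery    # cache hit, verified
cache._store[key] = b"tampered"    # an intermediary swaps the body...
assert cache.get(key) is None      # ...and the hash check catches it
```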


Well, they can change it if the page and the resource are not served over HTTPS, by changing both the hash and the response for the resource.

This doesn't add much unless at least the parent page is served over HTTPS; otherwise the hash can simply be stripped. A browser could ship an HSTS-style preload list of sites that must use hashes, but without HTTPS an on-path attacker can still rewrite both the resource and its hash together.
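That rewrite attack can be sketched concretely. This is a hypothetical illustration of the attacker's side, assuming an unencrypted page the attacker can modify in transit; the helper name is mine and the hash format is simplified to a bare hex digest:

```python
import hashlib
import re

def mitm_rewrite(html: str, payload: bytes) -> str:
    """Sketch of the attack described above: on a plain-HTTP page an
    on-path attacker controls both the markup and the resource, so it
    can swap in its own script AND recompute the matching hash. The
    integrity check then passes. (Hypothetical helper, not real
    attack tooling.)"""
    new_hash = hashlib.sha256(payload).hexdigest()
    # Rewrite every integrity attribute to match the attacker's payload.
    return re.sub(r'integrity="[^"]*"',
                  'integrity="ni:///sha-256;%s"' % new_hash, html)

page = ('<script src="http://cdn.example/app.js" '
        'integrity="ni:///sha-256;original"></script>')
evil = b'alert("owned")'
rewritten = mitm_rewrite(page, evil)

assert hashlib.sha256(evil).hexdigest() in rewritten  # attacker's hash installed
assert "original" not in rewritten                    # legitimate hash is gone
```

This is why the integrity attribute only protects subresources of a page whose own delivery is already authenticated.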


One of the other things I learned this year is that if CloudFlare is serving http for a domain, they can get an SSL certificate for it [1]. (This is broadly in line with what you'd expect for domain-validated certificates when you think about it, but I hadn't thought about it)

Is there much benefit to having a CDN that /isn't/ serving over https, if the CDN can get a signed certificate issued automatically anyway?

[1] http://blog.cloudflare.com/introducing-universal-ssl/


Inclusion in HTTPS Everywhere is purely voluntary. No one is forced to do anything.

Yes, the EFF, like most NGOs/non-profits, optimizes for status and publicity. And yes, CloudFlare is a global MITM.

It's still better than unencrypted connections and no one paying attention. They're raising the bar for eavesdropping from being in the same coffee shop to breaching CloudFlare.


And this is only the start. Hopefully the revelations this year, and the subsequent work by privacy activists, security professionals, and especially the EFF [1], have made the wider public aware of the extent to which their communications and lives are being intruded on.

Once we start seeing free SSL certificates next year [2], hopefully that will be another step in the right direction toward a more secure World Wide Web.

[1] https://www.eff.org/
[2] https://letsencrypt.org/



