Why your 'A' grade SSL is 'outdated cryptography' on Chrome (certsimple.com)
139 points by nailer on May 14, 2015 | hide | past | favorite | 83 comments

(Author of SSL Labs here.) From Chrome's perspective, it's easy to complain about obsolete cryptography because it takes only its connections into account. When you take a wider view and include other browsers, all sites today effectively must use obsolete cryptography because client-side support for authenticated suites is not yet strong. Even Google would have a "uses obsolete crypto" warning because it negotiates CBC suites with, say, Safari.

Non-obsolete crypto is relatively easy to achieve with Chrome and Firefox, but not as much with IE and Safari, and many other clients.

There's no doubt that sites with authenticated crypto should get a better score, but it's not straightforward to establish fair criteria. For example, should SSL Labs penalise sites that don't use authenticated encryption with IE? Because, to do that, you must either accept the DHE key exchange or deploy with dual (RSA and ECDSA) keys. The former is slow, while the latter requires more effort.

You could say that SSL Labs should focus only on the cases where it is possible to use authenticated encryption, but that creates another problem: should SSL Labs report the reality, or award best effort? It's not easy to decide, but so far I've been leaning toward the reality.

Finally, there's the case of using ChaCha20/Poly1305; should SSL Labs encourage the use of cipher suites that are not yet standardised? Should Google get a pass because we "know" they know what they're doing?

I don't understand why AE ciphersuites can't just own the 'A' grade, and everything else be a 'B' and below. You are currently awarding an 'A' "for effort". You should stop, now. We're all adults here.

And yes: you should be encouraging Salsa/Poly1305. They're de facto standards, and will very soon be de jure standards as well. Plenty of other "nonstandard" behavior is encoded into TLS. It's also not fair to call Salsa/Poly1305 a "Google" initiative.

There are still many clients out there that do not support AE ciphersuites (e.g. literally every version of Safari). If SSL Labs capped the grade at B for supporting these clients, then the top grade at SSL Labs would effectively become a B, as no serious site would get an A grade anymore. It wouldn't actually accelerate the transition away from these clients/ciphers.

Making the effective top grade a B might also reduce the psychological motivation for improving a seriously bad TLS config. At least in the US, anything less than an A is considered very close to failing these days (it's silly, but it's the reality). Making changes just to get a B isn't nearly as motivational as making changes to get an A.

Those older clients are actually vulnerable to real cryptographic flaws in TLS.

Which flaw, specifically, would the latest version of Safari be vulnerable to?

I guess it depends on your confidence in the Lucky 13 patch, right?


And to be clear, I'm not saying we shouldn't be moving away from the CBC ciphers as fast as possible - just that dishing out a B grade for supporting a large number of users is not a good way of accomplishing that.

You and tptacek are both right, so how about the following to combine your positions:

* Grade 'A' requires AES-GCM and ChaCha20Poly1305

* Allow CBC for 'A' iff it's a last resort option

* Use nginx equivalent of "ssl_prefer_server_ciphers on;"

CBC would then only be used by clients that don't support AE ciphersuites.
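That ordering can be sketched with Python's `ssl` module, which exposes OpenSSL's cipher-string syntax (the suite names are standard OpenSSL ones; which ones are actually available depends on the local OpenSSL build):

```python
import ssl

# Server-side context: AEAD (GCM) suites first, a CBC suite only as a
# last resort for clients with no AEAD support. With
# "ssl_prefer_server_ciphers on;" (nginx) or SSLHonorCipherOrder (Apache),
# the server's order, not the client's, decides which suite is used.
ctx = ssl.SSLContext(ssl.PROTOCOL_TLS_SERVER)
ctx.set_ciphers(
    "ECDHE-RSA-AES128-GCM-SHA256:"
    "ECDHE-RSA-AES256-GCM-SHA384:"
    "ECDHE-RSA-AES128-SHA"  # CBC fallback, negotiated only if nothing above matches
)

names = [c["name"] for c in ctx.get_ciphers()]
print(names)
```

This is only a sketch of the ordering idea, not a recommended production cipher list.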

Hi Ivan! Article author here, your book is sitting on my desk as I type this. The article is simply meant as an explanation to users.

I think that failing to negotiate GCM with a browser that supports it (and that will show a warning when GCM isn't used) should cap the mark at a B.

Understood re: IE.

Agreed re: ChaCha20/Poly1305, at least until openssl ships it (and maybe longer).

Why would "OpenSSL shipping it" make a difference? It's not like Poly1305 is the only MAC available in the "modern cryptography" bucket; you can use AES-GCM as well.

That was a specific response to:

> Finally, there's the case of using ChaCha20/Poly1305; should SSL Labs encourage the use of cipher suites that are not yet standardised?

I.e., I don't think it's worth encouraging ChaCha20/Poly1305 until people can use it.

They can, if they're using Chrome.

We need to get away from the idea that it's unfair to recognize superior cryptography merely because it's currently only implemented in Chrome.

The grade either reflects the actual quality of a server's TLS configuration, or it reflects something else.

In any case, it's a moot point here, because ChaCha20/Poly1305 is RFC7593.

I am not completely against de-facto standards, but, in the TLS space, Chacha20/Poly1305 is not one. At this time it's mostly used only by Google/Chrome (AFAIK, no other clients support it); we'd need to see better browser support (multiple organisations) and wider server-side support before we can accept it on the basis of widespread use.

Accepting non-standard crypto is a slippery slope, which is why I'd rather take a very conservative approach.

P.S. You mean RFC 7539, which came out only a couple of days ago. Now we only need another RFC for its use in TLS.

Why exactly do we need to see other browsers support Chacha20? That's a viewpoint I imagine is shared by zero working crypto engineers; am I wrong?

They can, if they're using Chrome and the server has compiled OpenSSL from source and applied the relevant patches, linked to in the article, required for OpenSSL to support it.

Conversely, people are unlikely to be able to use chacha20 until we start encouraging its use.

Sure, but we can't encourage its use until people can get access to server software that includes it, and openssl - and everything that links to libssl - doesn't right now.

Cloudflare implements Chacha20/Poly1305 using OpenSSL, and published their patches.

I know, that's why I linked to those patches in the article, then subsequently mentioned their existence in another response to you three hours before you wrote the above.

That doesn't change that people won't use it until it's in openssl properly and they don't have to maintain a patched version themselves.

I don't understand why a sober assessment of people's SSL implementations should account for stuff like this.

There are two options: enable the only modern native stream cipher available for TLS, and with it the only polynomial MAC that doesn't require hardware support (and thus the best polynomial MAC available for mobile devices), or don't.

The former option is superior to the latter option. The latter option is easier, but: nobody said engineering was supposed to be easy.

Either SSL Labs is evaluating the quality of TLS implementations, or they're evaluating something else. Arguments like the ones I see on this thread suggest it's "something else".

Encourage people to use a library that includes chacha20...

That's pretty reasonable, if LibreSSL or BoringSSL started shipping it (especially since they both have other advantages over OpenSSL).

I don't know if it's planned for the lets-encrypt tool, but these things should really be done automatically. It feels a bit like manually downloading .tar.gz files to do system updates instead of using a standard update tool. A system update should refresh the list of recommended ciphers for OpenSSL, and the webserver should just grab the list from OpenSSL directly; there should be no need to modify any configuration file. It's really exhausting to keep up with the current state of ciphers all the time. People point out the cost of certificates, but in reality the complexity of all of this is the real barrier; it's far more complex than it should be.

CertSimple (specifically me) patched node (specifically, io.js, which will become node again) as part of writing the article, so after the next release you won't need to do anything.

Other platforms should also keep up to date with browser changes. Cipher suites are a balancing act between compatibility and security; that's why Mozilla's Server Side TLS lets you pick the compatibility you want. But that doesn't mean that, e.g., Apache or nginx couldn't provide better defaults in newer releases (they may be planning to do so already).

Reply to realusername's comment below (rate limit):

Not offended at all! I think concentrating the discussion amongst people who have the time to look into this, then having those decisions flow down to projects as defaults, makes the web more secure. Mozilla's https://wiki.mozilla.org/Security/Server_Side_TLS project informed the node changes: hopefully it will also inform the defaults for, e.g., the next nginx.

Reply to derefr's comment (rate limit):

Mozilla Server Side TLS is that set of agreed upon conventions. The Moz logic is great, since the conventions produce 3 sets of cipher suites with different compatibility / security tradeoffs.

We made https://www.npmjs.com/package/ssl-rsa-strength as a library implementing Mozilla's logic a little while ago.

Sorry if it was offensive; it wasn't directed at your work. You're really making the web more secure with these patches, so I'm really grateful! It's just that sometimes the current usability state of crypto makes me a bit sad.

Another way to think about it: if there's a use-case where the API OpenSSL provides (requiring that the consumer specify a list of cipher-suites, etc.) is wrong, then that consumer should not be directly consuming OpenSSL; that level of abstraction is the wrong fit for their project.

What we need is a "library" that just hardcodes agreed-upon conventions on top of OpenSSL. Import that instead of OpenSSL, use its API (which would be mostly transparent calls through to OpenSSL using the current best practices), and everything gets taken care of. New best practices? Updated library package.
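A toy sketch of such a wrapper in Python (whose `ssl` module wraps OpenSSL); the module and function names here are made up for illustration:

```python
import ssl

# Hypothetical "secure defaults" module: consumers import this instead of
# hand-writing OpenSSL cipher strings. Shipping new best practices then
# means updating this one package, not every application.
MODERN_CIPHERS = "ECDHE+AESGCM:ECDHE+CHACHA20:!aNULL:!MD5"

def secure_server_context() -> ssl.SSLContext:
    """Return an SSLContext preconfigured with current best practices."""
    ctx = ssl.SSLContext(ssl.PROTOCOL_TLS_SERVER)
    ctx.minimum_version = ssl.TLSVersion.TLSv1_2  # no SSLv3/TLS1.0/TLS1.1
    ctx.set_ciphers(MODERN_CIPHERS)
    return ctx
```

Callers would just do `ctx = secure_server_context()` and wrap their sockets, never touching a cipher string themselves.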

Are you uncertain about whether OpenSSL's APIs are suited to their use cases? Let me tell you, every API that OpenSSL provides is as wrong as it can get. It has the wrongest APIs that have ever been perpetrated.

I appreciate Google being aggressive on this front, but over the last 6 months they've trained people, like my mom, to ignore the little green lock.

It's worse than that. They've trained users to ignore the RED "X" on the little lock and the RED strikethrough on the "https" that Chrome shows for sites with SHA1-signed certs that expire after 2016[1]. Reasonable or not, some sites can't or won't update their certs and have no choice but to tell Chrome users to ignore those warnings.

[1] http://googleonlinesecurity.blogspot.com/2014/09/gradually-s...

The biggest problem is that security is not a boolean, and yet we've conditioned users to think that it is. Instead of educating them on the different types and levels of security, we're giving them a black-and-white assessment. This makes it harder for them to make an informed decision.

Expecting people to get training to have a nuanced reaction to symbols used in security is a lost cause for web browsing. The mystique of the web is long gone. It's an every-person thing now, reduced to binary decisions.

Web browser developers need to make some decisions to make clear what is safe, probably safe, and a threat. Lumping wrong-but-probably-safe things in with threats means that users will ignore threats. Look at the history of Windows and users just clicking through dialog boxes without reading them.

You don't have to ignore the warning. It means that what you see on the site might have been recorded or modified in transit, and what you type in might be recorded or modified on the way back to the server. You might decide that it's fine for browsing but decline to put in personal information on those sites. It's up to you.

The problem is that outside of our niche of people who really care about tech, most people just care about the contents of the webpages they visit. They aren't reading/understanding the warning and have no context for making that assessment of safety and responding appropriately. They just know that they want to go to their bank's site or play the latest Mafiaville or whatever.

These changes in Chrome are teaching them that the red marks on the location bar are just a normal part of those interactions.

Yeah, I don't think my Mom is going to be able to make the decision.

Do you have any sources showing that the incidence of ignoring SSL warnings has increased?

It has always been a joke of a feature. If it's dangerous, block the website. If not, display it to the user.

Normal users don't want to deal with the technicalities. Presenting information to the user is just a way to "do something" while not actually taking responsibility and solving the problem.

How can you ever know what is dangerous or not for the user to do? Get some humility.

That's exactly the mindset that put us in this horrible situation. People would start warning against bad crypto much sooner if the consequences weren't as bad as browsers made them. We could even have warnings against plain HTTP already if browser security wasn't this radical.

Umm, 'danger' is not binary. There is no hard and fast rule that makes something 'dangerous'. There are levels of danger, and levels of risk. How can a browser know where that line is for every user and every site?

I think the idea that end users will avoid MITM attacks is just never going to happen, if it ever did. All this hair-splitting over SSL has long since ruined this, which is a shame because that's the most common attack an end user will face. Worrying that $nation_state_intel_services might someday crack your SSL session is a bit much. Google is a company run by, let's face it, paranoid web devs. In their minds, a potentially weak SSL cipher is just as bad as a MITM attack. I think the reasoning is foolish, personally, and we're in a period where Google's web-dev mentality rules the security roost. Focusing on trivial SSL issues really isn't making us more secure. If anything, it's training users to be baffled and potentially less secure, because they don't understand what all the random yellow and red x's mean, especially on trusted sites like their employer's site.

For completeness, also be aware that it's possible to get (yellow and red) warnings in Chrome even if you're serving a full SHA2 certificate chain and have the best possible configuration otherwise. Apparently, this is because Chrome outsources certificate chain construction and validation (different library depending on the underlying operating system), and they sometimes create SHA1 paths even when better paths are valid.

My understanding is that this happens because of cross-signing, where some intermediates are using SHA1. Just in the last couple of days I had users complaining about two different sites with certificates from two CAs.

Indeed. Debian is one OS which is affected by this, because they shipped an outdated version of NSS (the crypto library used by Chrome) which does the suboptimal path generation.

Yesterday, the latest version of NSS was finally uploaded to Debian Unstable, fixing the problem there, but Debian Stable is still affected, and will be until it's updated either through a security update or a stable point release. I plan to agitate for this if necessary.

Here's the relevant Debian bug report: https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=774195

It really shouldn't be possible to get an SSL Labs 'A' with anything but GCM or Poly1305.

Doesn't that mean giving up old browsers though?

Are you the kind of person who sees a 'B' as a failing grade?

If getting an A requires things that are, for a large portion of sites, not feasible, then I consider that a failing grade, but it is the test that fails, not me.

Anyway I asked a factual question, I would like an answer.

The answer is "no".

No, in this case, it just means preferring GCM. If a browser comes along that only does CBC, the server will pick CBC.

No. It means you get a B.

Would you want to ride in a car with failing brakes?

Our "C" rated SSL is "modern cryptography" on Chrome - because we offer ECDHE_RSA/AES_128_GCM - but we also offer RC4, because eCommerce, and because bloody WinXP users still exist and are still a source of revenue. They do win an "UPGRADE YOUR BROWSER, SUCKER, YOU ARE INSECURE!" banner, but we will have to keep RC4 available for as long as Windows XP continues to provide >1% of our clients' revenue.

You don't need to use RC4 to support windows xp with IE8 - you can even get away with using TLS1.0 and above. If you need to support IE6 though, then yeah, you're stuck :/

The cipher list below (for Apache) should work just fine with XP/IE8 and later, but still net you an A+ in SSL Labs, with a bit of other work.
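(The list itself didn't survive in this copy of the thread; a plausible equivalent, modeled on Mozilla's intermediate profile of the period, which uses 3DES rather than RC4 as the XP/IE8 fallback, would look something like this:)

```apache
# Server picks the cipher: AEAD suites first, 3DES last as the XP/IE8 fallback.
SSLProtocol             all -SSLv2 -SSLv3
SSLHonorCipherOrder     on
SSLCipherSuite          ECDHE+AESGCM:ECDHE+AES:DHE+AES:DES-CBC3-SHA:!aNULL:!RC4:!MD5
```

IE8 on XP negotiates TLS_RSA_WITH_3DES_EDE_CBC_SHA here, while modern browsers get the ECDHE-GCM suites at the front of the list.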


Remember that "modern cryptography" message in Chrome is presented per connection, so on your new browser you get the ECC cipher with that message.

But people on old browsers may be served RC4 and get "obsolete cryptography" as their message.

SSL Labs instead takes a holistic approach, and that's why it considers both the best and worst potential setup.

SSLLabs capping server scores for just supporting RC4 is kind of silly. The server gets to pick the cipher and the Finished message authenticates the rest of the handshake. So long as the key exchange portion is strong enough (and that's orthogonal to the bulk cipher, although if you're supporting XP, that's probably plain RSA and not ECDHE_RSA), the handshake has downgrade protection. You can securely[1] negotiate GCM with modern browsers while still accepting legacy things for older ones. Just order your ciphers right.

[1] I'm handwaving the version fallback. You'll want to also support FALLBACK_SCSV, which SSLLabs also checks for, until that thing is gone for good.

Most older SSL/TLS clients do support 3DES, the problem being that it is relatively slow.

Seriously, SSL is such a mess. I dutifully switched my product's website to SSL a year or so ago, and I can't count the amount of time I've spent trying to optimise my SSL settings, and trying to figure out what optimising them even means. I'm pretty technical, but like trying to run a correctly configured mail server, this is just too hard. So I end up not worrying about my grade and probably end up using old, obsolete encryption because of it.

We're going to lose this fight unless we can make this easier. I mean, I care and I should be capable of working this out, but I'm no security expert - it's just too much work and most of the information is totally incomprehensible to me so I end up not bothering. I'm sure I'm not alone.

I've had a lot of good experiences lately using SSLMate (https://sslmate.com/) to automatically buy, download, generate chains, and renew certificates.

Then, I use Mozilla's SSL configuration generator (https://mozilla.github.io/server-side-tls/ssl-config-generat...) to set up the cipher suite.

Those two together will have you up and running very quickly. It's certainly a good idea to know what you're dealing with, but there are smart people behind both of these services that really simplify things.

This area is a mess; there is now a difference between all major browsers and even between Firefox versions (with the Developer version complaining where regular FF does not).

For example, Firefox mentions SHA-1 insecurity when visiting the Google jQuery CDN. [1] Chrome doesn't seem to care, however, and there are other sites where it's vice versa.

[1] https://ajax.googleapis.com/ajax/libs/jquery/2.1.3/jquery.mi...

It's frustrating that Chrome and Firefox don't give any details about why the connection to an insecure site fails. You just get a "connection reset" page in Chrome, and a generic SSL security error in Firefox.

I was deploying a virtual appliance and was baffled as to why I was unable to load the configuration web page from both Chrome and Firefox, until I visited the vendor's web page and downloaded a hotfix that removed the SHA-1 certificate. This was an appliance released in 2015. It's a shame that the software vendor is releasing insecure virtual appliances, but it's also a shame that Google and Mozilla can't get their act together enough to display an informative error message.

How about "The site you are attempting to visit uses weak cryptographic algorithms, so the connection has been blocked."

And, please give advanced users a way to click through anyway. If this is a virtual appliance I'm deploying in an isolated network and I'm trying to access the administrative page, I should be allowed to get there, without some nanny state browser manufacturer determining that my crypto isn't safe enough.

> And, please give advanced users a way to click through anyway.

This already exists, and you can find it by reading through the source of Chromium.

Hmm, when Firefox errors for me it shows why. The usual reason is an expired certificate or a certificate for different domain.

It would have nothing to do with the cert being SHA1.

> With the Developer version complaining where regular FF does not

That's the right way to do it.

The Developer version should start complaining about outdated crypto years before the mainstream one. It should start complaining as soon as the crypto is shown to be broken and something better is available, even if any attack is still too expensive to perform in practice.

Oh I agree, especially since the Dev version is usually a version ahead of release. (E.g. currently dev is 39.0a2).

It's just frustrating that there are so many different configurations to keep track of, it just feels like the whole hosting / cert industry has not handled this particularly well.

I've noticed the same thing re: Chrome not caring about subresources. E.g., we use Mixpanel, Mixpanel uses Akamai, and Akamai uses CBC, which is considered obsolete, but that doesn't trigger an 'outdated' warning.

I asked David Benjamin from Chrome and right now, sub resources aren't included in Chrome warnings (he thinks they should be though):


I may be mis-remembering here, but isn't Akamai on a special whitelist in Firefox and Chrome not to complain about CBC use?

Author here. This actually came from a real case: it was fairly surprising the SSL Labs test wasn't flagging the ciphers that cause the 'outdated cryptography' warning.

Thanks for this blog post. I was attempting to fix up a server last week and couldn't find or figure out a fix for it at the time.

I think this is the experience of a lot of people and good security folks are taking notice to try and help: https://blog.digicert.com/understanding-the-google-chrome-co...

That article is about SHA1, which is a different topic.

I'd recommend checking out https://cipherli.st for practical information on ciphers. It's quite easy to achieve an A+ by following their recommendations, once you understand the impact some of the changes might have (eg, using HSTS).

There's even a set of config for poor sysadmins who are forced to support legacy clients on XP.

Happy to answer questions as well - I do this for work.

Reminds me of a trick used by creationist high school textbooks.

They cover (in little detail, and with some errors) evolutionary theory, but they say it's obsolete and then give you the flavor-of-the-month creationist justification as "what's current".

While we are on the topic of A-grade security: The article might want to point out that using standard DHE is preferable to ECDHE if you can handle the additional CPU load. The problem with ECDHE is that all currently TLS-approved curves need to be considered unreliable. This will change if 'draft-irtf-cfrg-curves-02' is accepted and Curve25519 and Ed448-Goldilocks become part of the standard.
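For example, in nginx (the file paths here are illustrative), preferring DHE means generating your own group rather than relying on a built-in one:

```nginx
# Generate a custom 2048-bit DH group once, offline (this is slow):
#   openssl dhparam -out /etc/nginx/dhparam.pem 2048

server {
    listen 443 ssl;
    ssl_dhparam               /etc/nginx/dhparam.pem;
    # Prefer DHE key exchange; keep ECDHE for clients that insist on it.
    ssl_ciphers               "DHE-RSA-AES128-GCM-SHA256:ECDHE-RSA-AES128-GCM-SHA256";
    ssl_prefer_server_ciphers on;
}
```

The custom group avoids sharing a default group with everyone else; the extra CPU cost per handshake is the trade-off the comment above mentions.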

Can you clarify why the TLS-approved curves should be considered unreliable?

Mumble mumble NIST mumble.

That's the problem with tinfoil; no matter how much I wrap around my computer, I can never be sure if the government has suborned my local grocery store.

Very informative but not very educational. I had much more pleasure reading what they referenced: https://wiki.mozilla.org/Security/Server_Side_TLS

Reading this article and the comments here, I'm struck by how complicated (or at least non-trivial) the whole thing is.

At the same time we are trying to push every website (including blogs, little-league clubs, joke sites) to https-only.

Secure connection: fatal error (40) from server.

ECDHE or gtfo :(

Chrome changing the rules again? Unpossible! Still, the good news is that people are leaving Chrome for FF, Safari, and the new IE, so soon it will be less of a problem. Hurrah!

You don't think forward secrecy is critical to secure communication? I'd rank it as more important than the SHA1 weakness problems.
