Hacker News

I'd actually recommend just getting the exact certificates you're going to work with. For example, if I'm using GitHub's API, it makes sense not to check the whole chain, but just to keep GitHub's original cert.


Disclaimer: I don't particularly agree with the Certificate Authority mechanism that we currently use with TLS.

However, given that it's what we currently have, I'd strongly advise taking advantage of the security that it provides. Requiring API client library authors to ship certs will make for poor security. Not only do certificates expire, they also get compromised.

It would be easy to conduct MITM attacks using revoked certs, and API client library users would be none the wiser. Instead, it should be the responsibility of HTTPS client libraries to use the latest cacerts data and support features like OCSP [1] for validating certificate revocations, etc.
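For instance, Python's ssl module already gives you this kind of validation by default: a minimal sketch showing that the standard context loads the platform's current CA bundle, requires a full chain, and checks hostnames (revocation checking, e.g. OCSP, is still up to the client on top of this):

```python
import ssl

# create_default_context() loads the platform's current CA store,
# requires a valid certificate chain, and verifies the hostname
# against the certificate it receives.
context = ssl.create_default_context()

# These are the defaults the context ships with:
print(context.verify_mode == ssl.CERT_REQUIRED)  # chain must validate
print(context.check_hostname)                    # hostname must match
```

Shipping a cert with the library bypasses exactly this machinery, which is the concern raised above.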

[1] http://en.wikipedia.org/wiki/Online_Certificate_Status_Proto...


Taking a validated known-good cert and, from that point on, simply verifying that the cert remains bit-for-bit identical is called "certificate pinning". While it does create problems with revocation, your service is going to break as soon as GitHub revokes its certificate anyway, so it's not as if you won't notice.
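That bit-for-bit check can be sketched as comparing a digest of the server's DER-encoded certificate against one captured earlier over a connection you trust (the pinned digest below is a placeholder, not GitHub's real fingerprint):

```python
import hashlib

# Placeholder: the SHA-256 hex digest of the known-good DER cert,
# captured out of band over a connection you trust.
PINNED_SHA256 = "0" * 64

def cert_matches_pin(der_cert: bytes, pinned_hex: str) -> bool:
    """Bit-for-bit pinning: hash the server's DER cert and compare."""
    return hashlib.sha256(der_cert).hexdigest() == pinned_hex

# Illustrative usage against a live server:
#   ctx = ssl.create_default_context()
#   with socket.create_connection(("api.github.com", 443)) as sock:
#       with ctx.wrap_socket(sock, server_hostname="api.github.com") as tls:
#           der = tls.getpeercert(binary_form=True)
#           if not cert_matches_pin(der, PINNED_SHA256):
#               raise ssl.SSLError("pinned certificate mismatch")
```

Hashing the whole cert is the simplest form of pinning; pinning the public key instead survives reissuance of the cert with the same key.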


The problem is that the CA mechanism (in particular, blindly trusting whatever list of root certs your vendor ships) does not provide security. I cannot go and validate the practices of those CAs, and all it takes is for one of them to get compromised (which they regularly do; have you removed the compromised CAs' root certs that ship with OS X, for example?).

The only certificates I actually trust are the self-signed ones from my organization, which I can actually go validate in person; I have ZERO trust in any of the certificates that my vendor ships.


The problem here is that rather than worrying about one particular cert getting compromised, you now have to worry about any CA in the world getting compromised, a much more likely possibility.

It seems the best course of action would be to trust only an individual cert, and check for revocation.

Also, OCSP is basically a joke: it works every single time, except when it matters (i.e., when an attacker controls your view of the network, most clients soft-fail and proceed anyway).


Yeah. I'll edit my post to add your points, but I can't do it right now.


The problem here is that if your app is deployed in a corporate environment, it's possible (likely) that the corporate firewall is intercepting your HTTPS traffic and returning a different certificate, issued by the IT department.

So if you try to validate that the certificate is the specific one that your API server is using, it's going to fail in that scenario.

Depending on your app, you could just ignore that possibility of course.


How does your scenario differ from a man in the middle attack?


The word "attack". In theory, being in a corporate environment, this is a desirable man-in-the-middle rather than a hostile one. It certainly is man-in-the-middle.


It's exactly what SSL/TLS is designed to defend against, and neutering your apps & applying newspeak doesn't make it preserve the security provided by SSL/TLS.


Conceptually, in a work environment you aren't accessing the website, your company is. If your company chooses to add an SSL proxy for its own purposes, there's nothing invalid, wrong, or unethical about that. Conceptually, you're all functioning as one entity.

You may note I'm using words like "theoretically" and "conceptually" in these replies, and that's basically because ctz's point is accurate. It isn't hard for someone on HN to be more competent at SSL usage than the administrator of the SSL inspector. But, well, welcome to the corporate world. Can't live with 'em, can't live without 'em. But I don't think it's wrong on any moral or technical level, it's just potentially wrong based on more mundane considerations, like competence.


Nobody is saying it is. However, this is the reality in a surprisingly large number of corporate environments. I have to support a lot of enterprise customers in my day job, and working around corporate firewalls is a large part of the issues that come up for us.


So you still need to provide some mechanism to prevent malicious attacks. The point of validating certificates is to prevent this class of attack.


Yes of course. In a corporate environment, you would usually install the proxy server's CA certificate in your certificate store and validate that all certificates were issued by the proxy server.
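A sketch of that setup in Python, where only the proxy's CA is trusted rather than the system bundle ("proxy-ca.pem" is a hypothetical path to the IT-issued root, so the load is guarded):

```python
import os
import ssl

# PROTOCOL_TLS_CLIENT defaults to CERT_REQUIRED with hostname checking.
context = ssl.SSLContext(ssl.PROTOCOL_TLS_CLIENT)

# Trust only the corporate proxy's CA instead of the system store.
# "proxy-ca.pem" is a hypothetical path; guarded so the sketch runs
# even where the file doesn't exist.
if os.path.exists("proxy-ca.pem"):
    context.load_verify_locations(cafile="proxy-ca.pem")
```

Every connection made through this context will then only succeed if the proxy (re-)issued the certificate, which is exactly the behavior described above.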

My original comment was just pointing out that the certificate you get when you connect to https://www.github.com on a corporate network may not be the same as the one you get on the open internet.

It's up to you to decide whether that's something you care about though.


Yes, but you're then reliant on the proxy to do the validation for you.


And for the proxy's private key (which you are henceforth relying on for all transport security) to be kept secure. Given my recent experience with products like websense, this is a very poor bet to make.


If GitHub's cert is revoked or expired, you'll have to manually go grab the new one. You want to trust the issuer of the cert.


That's great until it changes.

