This is what HTTP Public Key Pinning (HPKP) protects against.

It allows a server to specify the only certificates a browser should accept for it, meaning that MITM'ing is impossible without a valid cert chaining to one of the trusted certificates the server advertises.
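
For reference, the pins are delivered in an HTTP response header and cached by the browser. A minimal sketch of what such a header looks like (the base64 values are placeholder pin hashes, not real keys):

    Public-Key-Pins: pin-sha256="d6qzRu9zOECb90Uez27xWltNsj0e1Md7GkYYkVoZWmM="; pin-sha256="E9CZ9INDbd+2eRQozYqqbQ2yXLVKB9+xcprMF+44U1g="; max-age=5184000; includeSubDomains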

Chrome, Firefox, Opera, Chrome for Android, and the Android stock browser all support it.

I'm not sure how they intend to circumvent this problem, apart from perhaps just instructing users to not use those browsers? That's quite difficult to put into practice.




No, it isn't. Locally installed certificates override pins; if they didn't, HPKP browsers wouldn't be deployable inside large companies that have regulatory requirements to monitor traffic from their own desktops, and there would be no benefit, because an adversary who can install software or reconfigure your machine can defeat pins in a variety of other less transparent ways.

HPKP is great, but it doesn't address this problem.


I've posted this idea elsewhere, but it seems relevant again. What about, as a compromise, adding a new ExtendedKeyUsage bit for "TrafficInterception" that must be set on the CA (probably would need to be on the root and all intermediates) in order for HPKP to be ignored by the browser?

At the very least, this clarifies intentions and helps somewhat with situations like the Dell certificate where it's not intended for MitM.
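
As a rough sketch of the idea, the issuing side could look something like an extension section in an openssl.cnf; the "traffic interception" usage and the OID below are purely hypothetical, since no such value has actually been assigned:

    [ v3_interception_ca ]
    basicConstraints = critical, CA:TRUE
    keyUsage         = critical, keyCertSign, cRLSign
    # Hypothetical extended key usage marking this CA as a traffic-interception
    # CA; the OID is made up for illustration only.
    extendedKeyUsage = 1.3.6.1.4.1.99999.1.1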


As an aside, this is one of the reasons I believe letting locally installed certificates override pinning has a quickly eroding shelf life. At some point something bad will happen, like this (or something malware-based), and Chrome will raise their hand and say, "sorry enterprises, no more MITM at all, even for locally installed certificates."

After that, IE and Firefox will follow and crypto will no longer be so trivially subverted by enterprise organizations.


Response from Chrome engineers I've heard is that they won't try to fight this, because anything intentionally overridden locally is already a game over.

If they block custom certificates, then malware will patch the process to disable the checks.

And in this case the Kazakh government could say, "For your safety, Chrome doesn't work with our Internet. Please use our Khrome instead".


That's another arms race Chrome will lose, because the market for the kinds of enterprise configuration management and "host protection" tools that could override this policy exists and is very lucrative. Chrome would simply be begging enterprise security companies to release products to fuck up their browser.

I disagree with you.


Chrome already has an "enterprise" version; moving a less braindead corporate monitoring system than MITM certs into it would probably be a good thing.

It would also be a good way for them to start pushing a "two party consent" model for private wiretapping -- it's illegal for my employer to record my office phone because it's a violation of the other party's rights. Facebook has as much of a stake in not letting employers monitor employees' social media use as the employees do.


Or someone just forks Chromium and releases Chromium For Enterprise.


Which again helps nobody, because forks of Chromium will inevitably lag on security fixes.


At my last job, my manager tasked me with finding a way to defeat Chrome's update mechanism for all of our employees because a new version had introduced a bug that broke our internal web applications. I disregarded his plan and just introduced a workaround for the bug, but the point remains that enterprise customers already consider it a value-add to freeze their software in time for perpetuity. Hell, my workstation was running XP until I needled IT to grant me a "premature" upgrade to Win 7.


Sorry, I meant that from the perspective of the Chromium team.


Indeed, I agree with the policy that if someone can control what's installed locally, they've already won.


Part of me hopes you are right, because I don't like seeing Chrome/Google throw its muscle around.

Part of me hopes you are wrong, because I think encryption and security don't need to be opposing forces and MITM isn't required for strong security (though maybe for good GRC and audit).


Key pinning you can't override locally is DRM.


That's a super interesting way of framing the conversation.


Someone suggested using a special icon in the address bar to denote this kind of thing.


This would be smart. It doesn't need to be ominous, just informative.


You used the words "install software" for the second time, which makes me wonder whether the citizens of Kazakhstan will be forced to install an executable or just a certificate, because you make it sound a lot scarier than it looks -- or maybe I did not fully understand the repercussions.


It's pretty trivial to detect whether a browser trusts your CA with client-side JavaScript. Such JavaScript could be injected into HTTP pages and throw up a div covering the page until the certificate is installed (instructions provided, of course).
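
A minimal sketch of how that detection could work, assuming a probe host (ca-check.example.kz is made up) serving a tiny image over a certificate issued only by the new CA, and an injected overlay div with id "cert-overlay":

    // Probe a host whose cert chains only to the new CA. If the browser
    // trusts that CA the image loads; otherwise the TLS handshake fails
    // and onerror fires, so the "install the cert" overlay stays up.
    var probe = new Image();
    probe.onload = function () {
      document.getElementById('cert-overlay').style.display = 'none';
    };
    probe.onerror = function () {
      document.getElementById('cert-overlay').style.display = 'block';
    };
    probe.src = 'https://ca-check.example.kz/pixel.png?t=' + Date.now();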


Ah, thanks for clearing that one up! As much as I dislike it, that reasoning does make sense.


It would be great if the use of a locally installed CA were flagged, for instance, by a question mark next to the lock icon. That would at least make it visible.


> Locally installed certificates override pins.

Which is a blatant security risk that should be fixed immediately.

> wouldn't be deployable inside large companies that have regulatory requirements to monitor traffic from their own desktops

I guess they don't use ssh?

This claim is complete nonsense, because you are conflating installing a certificate with the capability to override HPKP. All those businesses need is a way to add an exception to HPKP. This is no more difficult than any other IT-managed configuration.

The bug here is the assumption that installing a certificate always means HPKP should be overridden. This assumption is patently not universally true, as this Kazakhstan situation demonstrates.

> no benefit

Why is it that so many people seem to forget about the concept of Defense In Depth when one of the layers of protection is attacked?

A physical-key analogy: there is a decent chance the lock on your home's front door can be opened trivially with a bump key[1], which is an attack against the entire class of traditional pin-and-tumbler locks. The many homes that have such a lock can be entered in seconds. Does this mean that they shouldn't bother locking their front door? No - while it might be a good idea to invest in a better lock, forcing someone to bump the lock has benefits. Someone trying to enter your house might not have the right tool. If they do carry a bump key, that could have legal consequences ("burglar's tools").

Layered defenses help to reduce attack surface and raise the attack cost.

> less transparent

I fail to see how forcing an attacker to patch binaries or otherwise work around HPKP is a bad thing. Doing so will leave clear evidence that the system has been tampered with. A properly installed certificate, on the other hand, has a small amount of plausible deniability.

> it doesn't address this problem

It doesn't solve the problem, but it should be a speed-bump that makes the attack harder, raising the cost of MitM.

[1] https://en.wikipedia.org/wiki/Lock_bumping


I don't care enough about this to argue about it. I see why people don't like that Chromium works this way, and I see clearly why Chromium doesn't let pins override local configuration. Meanwhile, the cost of bypassing pins if you can run code locally is very low, not enough to change my risk calculus. I can see why Google doesn't start an arms race over a trivial speed bump, and I can see why you might want the speed bump.

If you want to be outraged about it, that's fine. I know other smart people who are also outraged about it.

Remember, though: we largely have Google and Chromium to thank for pioneering certificate pinning in the first place.


> I'm not sure how they intend to circumvent this problem, apart from perhaps just instructing users to not use those browsers? That's quite difficult to put into practice.

Even if pins overrode locally installed certificates, all they would have to do is to block all outgoing raw HTTPS traffic. All these browser-side security mechanisms can do is to refuse to initiate insecure connections (and inform the vendors about broken pins). They can't force a network that is actively designed to forbid private connections to allow them.


It is a little more complicated than that. If you start doing MITM on HTTPS connections where pinning is involved, typically those sites / apps will just stop working, as they don't trust the CA for the cert that is injected during the MITM. So yes, it 'protects' you, but it does so by not letting you access that page / app. Chrome (and I'm sure most other browsers / apps) can have their pins overridden by user-installed root CAs (which is what they are pressuring people to do in this scenario).

You can read more about how Google does certificate pinning here: https://www.imperialviolet.org/2011/05/04/pinning.html


Could Kazakhstan take "national security cert" traffic, crack it, and then apply a different, globally trusted cert? Couldn't they also strip the Public-Key-Pins header from incoming traffic?



