Hacker News

I've always been pretty skeptical of secure boot and trusted hardware modules since it seems to generally mean "trusted by the manufacturer, not the user". The model feels like the manufacturer owns and controls the device, and the user is just barely trusted enough to press the boot key.

Because of this I appreciated that the article stressed the importance of open-source firmware and called out companies like Intel for user-hostile approaches.

It's been like that since the beginning of "Trusted Computing" --- it was originally for DRM. Only within the last decade(!) has it been advocated strongly as a security feature, and my belief is that the companies (and the government) have realised that security paranoia is a powerful tool of control. Unfortunately most people won't question anything if it's "for security".

I don't think open source is really all that important, and the article is very misleading in that respect; in fact, if we don't have the keys, all that open source does is let us see more easily how they're oppressing us. (Of course, there's also the Ken Thompson Hack --- inspecting the binary is the real way to determine if there's anything unusual.)

This is a related article which everyone interested in this topic should read: https://www.gnu.org/philosophy/right-to-read.en.html

I long for the days when we said security was compromised once an adversary had physical access, and when a jumper costing a fraction of a cent, physically setting things to read-only, worked just fine. Today I primarily distrust my systems because the manufacturers see me, the owner, as the #1 security risk, and favor corporate interest over client interest.

Even inspecting the binary doesn't prove anything in general. Halting problem...

You're right to be skeptical, but for the wrong reasons. As others have pointed out, it isn't 1984. Computers are too complicated for users to be trusted.

The real problem here IMHO is that just as the OS is horrendously complex and cannot be left to the user to configure properly, trusted hardware is also broken! For a very recent example, see https://tpm.fail. The TPMs named in that disclosure passed rigorous CC EAL4+ and FIPS 140-2 certifications. So even the certifications fail to catch the very flaws they are supposed to test for. (I haven't studied the matter in enough detail to determine whether the testing regime itself is weak, or there's a Boeing/FAA level of corruption, or something in between.)

For another recent example, Java Card has been shown to be weak in certain use cases.

The big problem with these hardware flaws is that you end up putting absolute faith in them, since they form the TCB (trusted computing base). When the hardware is eventually (and almost certainly) found to contain a flaw, the entire rest of your security tends to fall apart, and generally you cannot repair it without replacing the device entirely. That might be OK (you will eventually replace the hardware through normal obsolescence) or not (embedded [in your body] medical devices).
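The "absolute faith" point can be made concrete. A TPM's platform configuration registers (PCRs) are hash chains that can only be extended, never set directly, so every attestation built on them is only as trustworthy as the silicon computing the chain. A minimal sketch of the extend operation, assuming SHA-256 (illustrative only, not a real TPM interface):

```python
import hashlib

def pcr_extend(pcr: bytes, measurement: bytes) -> bytes:
    # PCRs can only be extended, never overwritten: new = H(old || measurement)
    return hashlib.sha256(pcr + measurement).digest()

# PCR starts at all-zeros on reset; each boot component is measured in order.
pcr = b"\x00" * 32
for component in [b"firmware", b"bootloader", b"kernel"]:
    pcr = pcr_extend(pcr, hashlib.sha256(component).digest())

# A verifier compares the final value against an expected "golden" value.
# If the TPM computing this chain is flawed, every attestation it signs
# is worthless -- and there is no software fix for that.
```

Note that extend order matters: measuring the same components in a different order yields a different PCR value, which is why the chain pins down the whole boot sequence, not just its contents.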

What I like about the HP ProLiant platform is that the TPM chip is an add-on card.

At the same time, a lot of modern security discussion centers on users being tricked or guided into making mistakes, and on how hard it is to differentiate between a user and a malicious program.

For example, forcing automatic updates with no postponement because users simply won't apply security updates otherwise. Taking freedom away from users? Yeah, kinda. Increasing security? Absolutely.

Most users lack a deep enough understanding of security to protect themselves. Also, even if they do have knowledge, they might still ignore it in order to install games or watch dancing pigs or whatever.

I think Google offers a good compromise with the write-protect screw in their Chromebooks that allows overriding the bootloader.

They mean that because that's what they were designed to do. In big corporate networks it's the non-tech-savvy users who are the weakest link in the chain, and TPM is a way to provide guarantees.

One of the drivers of all this technology is that big business wants to be able to sue providers when everything breaks and avoid the “your users did that” excuse.
