Can anyone help decipher this, in terms of the concerns about owning my own computer and controlling what happens?
I intuitively distrust any vendor pushing this kind of requirement. This was probably caused by reading Slashdot back when the first TPM modules were introduced!
Do you believe we own our phones today, in a control sense? If not, you might not be open to my view. I think we are collectively accepting that the power-tool-like capability of computers tends to be used against the general consumer more than by them, hence the move of some critical auth work into chips like the TPM/TEE/SEP. I think I engaged in some of the same Slashdot discussions back then, but these days I just see a cybersecurity war, which we seem to be losing.
Could TPM requirements be the metaphorical camel's nose in the tent before we end up with a global immutable ID that spies on me and reports everything I do, enabling a surveillance dystopia? Could some remote authority block me for good via remote TPM attestation requirements and deny me the ability to execute code or make network requests?
The 'state', as it were, doesn't seem able to take control of our computing away from the current equilibrium of OS+hardware stacks available to the world, despite the predictions from TPM opponents. At least not yet. Open-source OSes are arguably stronger than ever. Some of the corporate players have positioned themselves as some of the biggest roadblocks to state control, or transformed into security companies. The corporate-state merger of control over the consumer hasn't quite materialized the way it was predicted.
Complexity is way, way up, but the idea you own your own computer is still there, for a lot of definitions of 'own'. Are you willing to anchor your trust in even a single other entity? Will you run code you have no idea how to understand or the time/resources to audit? The complexity today seems so high and the amount of trust to get anything done so substantial, I don't know how relevant computing purity arguments are any longer.
I'm certainly willing to believe I don't "own" my phone today. I use an Apple device, and until recently there was only one allowed way to distribute software. Still, as far as I know, it's not possible to run software as root (i.e. expressing ownership!), which is why people use jailbreaks to achieve more control.
In general I believe we're more likely to see 'soft' power: enforced through social mores, or just making something without options, creating a 'that's just how it is' situation. So I'm less inclined to believe that we'd be forced to accept things like remote attestation or global IDs.
...and yet. Those are possible, and would appeal to some governments, and we all know how easy it is to pass bad laws. Plus, they are extremely powerful: we could be surveilled and controlled far more than, say, the Stasi managed in East Germany. And look at what they were able to achieve in terms of control: imagine what could be done today. The idea is terrifying. I believe it's entirely possible to create a society where the power imbalance is so strong that democratic, freedom-oriented change would no longer be feasible.
Seems like a propaganda piece to justify DRM and remote attestation in the name of security.
> It provides a vastly more efficient and secure platform for Windows 11...
Efficient? Huh?
> to use through advanced encryption methods, improved industry standard cryptography, ...
Really? All of that runs on the CPU or on controllers in the data path, not on the TPM. The TPM is actually quite slow.
> and greater interoperability with other security functions.
What?!
--
About the only things it does that a regular user may actually want are storing disk encryption keys and providing a secure enclave for application encryption keys. Everything else is desired by Microsoft, the manufacturer, and Hollywood.
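To make the "storing disk encryption keys" part concrete: the TPM doesn't just hold the key, it *seals* it against measurements of the boot chain (PCR values), so the key is only released if the machine booted the expected firmware/bootloader/kernel. Here's a toy, stdlib-only Python model of that idea; the real TPM 2.0 sealing uses authenticated encryption under a chip-internal key, not the XOR masking used here, and all names below are made up for illustration:

```python
import hashlib
import hmac
import os

def extend(pcr: bytes, measurement: bytes) -> bytes:
    """PCR extend: new = H(old || H(measurement)) -- a one-way accumulator,
    so you can't 'un-measure' a component after the fact."""
    return hashlib.sha256(pcr + hashlib.sha256(measurement).digest()).digest()

def measure_boot(components: list) -> bytes:
    pcr = bytes(32)  # PCRs start at all-zeros on reset
    for c in components:
        pcr = extend(pcr, c)
    return pcr

def seal(secret: bytes, pcr: bytes, tpm_secret: bytes) -> bytes:
    # Toy "sealing": XOR the secret with a mask derived from a TPM-internal
    # secret plus the expected PCR value. A real TPM encrypts under a key
    # that never leaves the chip and enforces the PCR policy itself.
    mask = hmac.new(tpm_secret, pcr, hashlib.sha256).digest()
    return bytes(a ^ b for a, b in zip(secret, mask))

def unseal(blob: bytes, pcr: bytes, tpm_secret: bytes) -> bytes:
    return seal(blob, pcr, tpm_secret)  # XOR masking is its own inverse

tpm_secret = os.urandom(32)  # stands in for the chip-internal secret
disk_key = os.urandom(32)
good_boot = [b"firmware v1", b"bootloader v2", b"kernel v5"]
evil_boot = [b"firmware v1", b"evil bootloader", b"kernel v5"]

blob = seal(disk_key, measure_boot(good_boot), tpm_secret)

# Same boot chain -> same PCR -> the disk key comes back
assert unseal(blob, measure_boot(good_boot), tpm_secret) == disk_key
# Tampered bootloader -> different PCR -> garbage, disk stays locked
assert unseal(blob, measure_boot(evil_boot), tpm_secret) != disk_key
```

This is why a BitLocker-style setup can boot to an unlock prompt without the user typing a long passphrase, while a modified boot chain gets nothing.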
I'm curious what you mean -- corporations really require this to control employee machines? I (naively?) thought they can be locked down well enough as-is.
I don't know what companies do, so I'd love to hear more about this.
Require might be too strong a word, because they do have the built-in OS policy controls plus all the 3rd party security tools, but TPM-based security enables a lot when you manage PC fleets.
Up until recent releases, fleets of PCs were shockingly vulnerable to off-the-shelf toolkits that dump credentials, escalate to root privileges, move laterally, and install remote access toolkits. Ransomware is the endgame du jour, but data exfiltration happens as well. That one glaringly vulnerable OS is probably a rough explanation of most ransomware news stories.
Ops groups are by and large happy to see Windows gaining the hardening features we've been seeing in iOS/Android, which both reduces compromise and makes it easier to authenticate employees when there's a chain of trust around identity involving the device the employee's authing from. A benefit is that official Microsoft features like these reduce user friction and security risk in tandem, but the TPM has ended up as the answer once you assume your adversary will get into your kernel without one.
Fundamentally, if you want to make sure the client system that connects to your network service is running the software/configuration you want, unmodified, you need some help from the client in the form of cryptographic attestation, and you hope the keys cannot be physically extracted from the TPM chip to forge that proof some other way.
Now, let's say you have a policy that bans USB ports, or requires some corporate spyware/antivirus to run and not be disabled by the user, or else... this is the sort of stuff IT desires.
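The attestation flow described above can be sketched in a few lines. This is a toy model, not the real TPM 2.0 quote format: an HMAC over a shared key stands in for the TPM's asymmetric attestation signature (stdlib has no RSA), and all names are invented. The shape of the protocol is the point: the server challenges with a fresh nonce, the client's TPM signs its measured state plus that nonce, and the server only accepts the known-good state.

```python
import hashlib
import hmac
import os

# Hypothetical attestation key provisioned at device enrollment. In a real
# TPM this is an asymmetric key whose private half never leaves the chip;
# a shared HMAC key stands in here to keep the sketch stdlib-only.
ak = os.urandom(32)

def quote(pcr: bytes, nonce: bytes, key: bytes) -> bytes:
    """Client side: sign (measured PCR state || server nonce).
    The fresh nonce is what prevents replaying an old, healthy quote."""
    return hmac.new(key, pcr + nonce, hashlib.sha256).digest()

def verify(expected_pcr: bytes, nonce: bytes, sig: bytes, key: bytes) -> bool:
    """Server side: recompute over the *expected* PCR value and compare.
    Any deviation in measured state makes the signature mismatch."""
    return hmac.compare_digest(sig, quote(expected_pcr, nonce, key))

# The "golden" measurement the server expects from a compliant machine
golden_pcr = hashlib.sha256(b"approved firmware+kernel+config").digest()

nonce = os.urandom(16)              # fresh challenge per connection
sig = quote(golden_pcr, nonce, ak)  # healthy, unmodified client
assert verify(golden_pcr, nonce, sig, ak)

# A client whose boot state was tampered with produces a rejected quote
tampered_pcr = hashlib.sha256(b"rootkit present").digest()
assert not verify(golden_pcr, nonce, quote(tampered_pcr, nonce, ak), ak)

# Replaying an old healthy quote against a new challenge also fails
assert not verify(golden_pcr, os.urandom(16), sig, ak)
```

This is also exactly the mechanism that makes the "blocked by some remote authority" worry upthread technically plausible: the same check that rejects a rootkit can reject any configuration the verifier dislikes.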