Physical Unclonable Functions, a unique, inborn, unclonable hardware-level key (pufsecurity.com)
34 points by Nokinside on Feb 9, 2022 | 22 comments



"PUFrt builds upon eMemory’s anti-fuse OTP (NeoFuse) and Quantum Tunneling PUF (NeoPUF) technologies to provide self-encrypted anti-Fuse OTPwith on-chip PUF. The inborn HUK utilizes the NeoPUF’s guaranteed randomness..."

With enough new buzzwords, all things are possible.

The idea here seems to be a security token that generates its own random key and never lets it out of the device. Reasonable enough. The claim is that this is resistant to physical attack to obtain the secret key. Yet that's an unlikely attack vector. If you have physical possession of the token long enough to extract its unique private key by elaborate means, why do you need to clone it?

The usual problems with such devices involve replay attacks, or remote access attacks. (A remote access attack is when a token is used to open one door, and the door has been compromised to pass through the challenge/response from another more important door elsewhere, allowing an attacker to open the more important door.)
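
To make that concrete, here is a minimal Python sketch of the relay (the HMAC-based challenge/response, the door names, and the shared-secret setup are my own illustration, not anything from the article): the compromised low-value reader simply forwards the important door's challenge to the legitimate token and passes the valid response back.

    # Illustrative relay sketch: the honest token answers whatever challenge
    # the reader in front of it presents, so a compromised reader can relay
    # a more important door's challenge through it.
    import hmac
    import os

    def token_respond(token_secret: bytes, challenge: bytes) -> bytes:
        # The honest token signs whatever challenge it is shown.
        return hmac.digest(token_secret, challenge, "sha256")

    def important_door_check(shared_secret: bytes, challenge: bytes, response: bytes) -> bool:
        expected = hmac.digest(shared_secret, challenge, "sha256")
        return hmac.compare_digest(expected, response)

    secret = os.urandom(32)                           # provisioned in token and verifier
    vault_challenge = os.urandom(16)                  # issued by the important door
    relayed = token_respond(secret, vault_challenge)  # forwarded via the compromised door
    assert important_door_check(secret, vault_challenge, relayed)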

This isn't relevant to DRM. DRM for broadcast media requires that there be many protected copies of the same decrypt key. This thing is for unique keys only.

It seems strange that such a device would have a JTAG port, even if it is "disabled after the manufacturing process".


JTAG is typically used for pass/fail binning and scanning of logic/memory at package (and occasionally wafer) test. Without something like that it's very hard to get to <1000ppm defect rates. Properly designed, you wouldn't expect the Unique ID to be readable/extractable even from an enabled port, but you could certainly reverse engineer its operation more easily.


For a small, low-speed single-purpose device like this, a self-test function can determine if the whole unit is a reject. There's no assembly into a larger system, where you want to detect bad parts before you build up a board. For a pure security device, the less internal access, the better.


> why do you need to clone it?

http://people.csail.mit.edu/rudolph/Teaching/Lectures/Securi... gives 2 use cases.

I can think of a supply chain attack prevention use case as well.

I'm sure there are others. I also like the summary https://www.intrinsic-id.com/physical-unclonable-function/ , with low cost for IoT being the most important aspect.


You aren't wrong about the writing style. Reading it is a hard slog.

> The idea here seems to be a security token that generates its own random key and never lets it out of the device.

That is an idea. It's one that's been around since the first PIN pads were developed a few decades ago, and it has since evolved into the TPMs we have today. But it's not the idea being pushed here. What's being pushed here is a way of protecting the key.

The answer to the question "if you have access to the key long enough to extract the key why do you need to clone it" is simple enough. It can resist giving up its information until you prove you're you. Example: enter the wrong PUK 8 times and it destroys its secret key. It's another concept that's been around for literally decades.
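
A minimal sketch of that lockout behaviour in Python (the class, the PUK handling, and the limit of 8 are illustrative assumptions, not something the article specifies):

    import hmac
    import secrets

    class LockingToken:
        MAX_ATTEMPTS = 8

        def __init__(self, puk: str):
            self._puk = puk
            self._secret = secrets.token_bytes(32)  # never leaves the device
            self._failures = 0

        def sign(self, puk: str, challenge: bytes) -> bytes:
            if self._secret is None:
                raise RuntimeError("secret destroyed")
            if not hmac.compare_digest(puk, self._puk):
                self._failures += 1
                if self._failures >= self.MAX_ATTEMPTS:
                    self._secret = None  # irreversible wipe
                raise PermissionError("wrong PUK")
            self._failures = 0
            return hmac.digest(self._secret, challenge, "sha256")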

There are stories about people going to enormous lengths to extract the keys out of similar devices that will resist merely probing them. The one that sticks in my mind was Mondex, https://en.wikipedia.org/wiki/Mondex, where a university used an electron beam to reconnect some test pins (which I presume performed a very similar function to the JTAG on this device) to extract secret keys. That "hack" killed Mondex, although "hack" is probably the wrong word. Mondex put up a $1M prize, and the hackers won it.

There are also YouTube videos of people reverse engineering chips using little more than sandpaper and a digital microscope. And they do that for little more than a few seconds of fame on YouTube. At the other end of the scale, the lengths people are willing to go to protect secret keys are extraordinary. Look up the price of HSMs some time. https://en.wikipedia.org/wiki/Hardware_security_module

So I say you're wrong - an ironclad way of protecting the key really is a big deal. A cheap, true PUF (as they call it) would be a breakthrough. But after reading the article and the comments here, I have no idea if they've even moved the needle.


> This isn't relevant to DRM. DRM for broadcast media requires that there be many protected copies of the same decrypt key. This thing is for unique keys only.

Game consoles. Game consoles encrypt their boot disks with device-specific keys. They also have device-specific client certificates for authorizing access to game CDN servers (allowing console vendors to ban consoles by revoking those certificates.)


For cryptocurrency wallets, replay attacks aren't a possibility because they wipe themselves after a few attempts.


I wonder how much worthless stuff we'll have in the future because a cosmic ray happened to destroy the only copy of a vital key somewhere.


Apparently it happened to me when I took a laptop with full disk encryption on a plane. After that it wouldn't decrypt anymore. I spent a fair amount of time googling recovery methods and nothing worked.


Just create a couple of keys for the same task. I think there should be encryption schemes that allow you to use any one key from the set for decryption.
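
One standard way to get that property is envelope encryption: a random data key encrypts the payload once, and that data key is wrapped separately under each backup key, so any single backup key recovers everything. A hedged Python sketch, assuming the third-party `cryptography` package (function names are mine):

    import os
    from cryptography.hazmat.primitives.ciphers.aead import AESGCM

    def encrypt_for_keyset(plaintext: bytes, wrapping_keys: list[bytes]):
        data_key = AESGCM.generate_key(bit_length=256)
        nonce = os.urandom(12)
        ciphertext = AESGCM(data_key).encrypt(nonce, plaintext, None)
        # Wrap the data key once per backup key; any single copy is enough later.
        wrapped = []
        for kek in wrapping_keys:
            wrap_nonce = os.urandom(12)
            wrapped.append((wrap_nonce, AESGCM(kek).encrypt(wrap_nonce, data_key, None)))
        return nonce, ciphertext, wrapped

    def decrypt_with_one_key(kek: bytes, index: int, nonce, ciphertext, wrapped):
        wrap_nonce, wrapped_key = wrapped[index]
        data_key = AESGCM(kek).decrypt(wrap_nonce, wrapped_key, None)
        return AESGCM(data_key).decrypt(nonce, ciphertext, None)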


I hope this technology ends up being an utter failure, because it seems way more useful for DRM than for any legitimate security. After all, the entire premise of it is to help devices keep secrets from their owners.


That's not really the premise. The premise is to keep secrets safe from malicious actors with physical access. The capability can be used for many things.

Honestly, if people don't want DRM, they're going to need to find a way to solve that problem that doesn't involve technology. It's a losing battle, especially since the vast majority of people don't care and would much prefer their device isn't horribly unsafe.

Hoping that this technology will fail is not going to be effective. It isn't going to fail, we've already got tons of examples of similar technology.

Instead, legal rights, i.e. the right to repair, are going to be far more effective.


> That's not really the premise. The premise is to keep secrets safe from malicious actors with physical access.

How can you distinguish the owner from a malicious actor with physical access, in a way that doesn't cause forgetting a password to immediately and permanently turn the device into e-waste?


You can't. I said the premise is to thwart attackers. The implementation can be used for all sorts of things.

Or you can but it's a lot of extra work? The point remains.


Of course it's going to fail in consumer electronics.

For a majority of content, an 'unbreakable' DRM layer drastically decreases its value.

In our pandemic world I can simply say: the R value plummets.

Or in simple terms: Bob doesn't care if a show is mediocre, but he does care about what the other kids are watching.

But this shit's going to dominate in the H(ardware)aaS B2B world.


How is it going to fail in consumer markets when it's been around for years? Every mobile phone is effectively locked - users don't have root and don't control the operating system.

And yet mobile phones are ubiquitous and users don't give a shit.


This seems like great engineering (I am not competent enough to comment on that side of things), but I wanted to point out that "physically unclonable" here seems more like marketing speak, not something fundamentally unclonable. What they have might be very difficult to clone, but unless it is practically quantum-mechanical, then the laws of physics permit its exact measurement and copying.


From what I've seen, PUFs tend to go for a level of security more like "unclonable except by a nation-state with physical access, a battalion of electron microscopes, and the willingness to expend person-years of effort". For example, with a flip-flop PUF I think you'd need to reverse engineer minuscule imperfections in every single transistor.
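
As a toy illustration, here is a rough Python model of an SRAM/flip-flop-style PUF (the bias and noise numbers are made up): each cell powers up to 0 or 1 depending on a tiny per-chip manufacturing bias plus thermal noise, and a stable fingerprint comes from majority voting across power-ups, so cloning would mean measuring the per-cell biases themselves.

    import random

    def make_cell_biases(n_cells: int, seed: int) -> list[float]:
        rng = random.Random(seed)  # stands in for per-chip process variation
        return [rng.gauss(0.0, 1.0) for _ in range(n_cells)]

    def power_up(biases: list[float], noise: float = 0.2) -> list[int]:
        # Each power-up a cell settles to 1 or 0 depending on bias plus noise.
        return [1 if b + random.gauss(0.0, noise) > 0 else 0 for b in biases]

    def stable_fingerprint(biases: list[float], readings: int = 15) -> list[int]:
        # Majority vote across repeated power-ups to wash out the noisiest cells.
        votes = [sum(bits) for bits in zip(*(power_up(biases) for _ in range(readings)))]
        return [1 if 2 * v > readings else 0 for v in votes]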


As far as I can tell, it uses blown fuses to encode the key. I'd bet Ken Shirriff (or anybody with similar hardware) could get the key out and clone this.

Edit: my mistake, that's the old and busted being compared to the new shiny. You'll need more than microphotography. As I recall, folks have done cool stuff inducing current with lasers, which could probably work here. Slightly fancier tech required, but I'm dubious that this is actually unclonable.


While this might have niche uses, I will say no thank you. The last thing we need is more tech for device attestation and DRM diminishing the property rights of buyers.


Could someone explain how this compares to something like ChipDNA? (I had a hard time parsing the substance out of all the buzzwords)


I am not sold on PUFs because the way I have seen them described so far raises all the red flags of bad crypto. It was funny to read the article starting with Kerckhoffs's principles, because I was going to open my comment with just how PUFs are the most recent chancer having a go in a long line of systems designed to sidestep those principles.

The steelman cases for PUFs, I think, are a) that it could be a solution for device attestation (theoretically, you can use the PUF as a secret diversification component in a KDF), or b) using it the way initialization keys are used in secure elements today, where they are the bootstrapping keys for a protocol that installs personalized device keys (e.g. a temporary, weak attestation).
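
For case a), a minimal sketch of what "secret diversification component in a KDF" could look like, assuming the raw PUF response is available as input keying material (uses the third-party `cryptography` package; the function name and info string are illustrative):

    from cryptography.hazmat.primitives import hashes
    from cryptography.hazmat.primitives.kdf.hkdf import HKDF

    def derive_device_key(puf_response: bytes, device_id: bytes) -> bytes:
        # In a real design the PUF response never leaves the chip; it is a
        # plain byte string here only so the derivation step is visible.
        return HKDF(
            algorithm=hashes.SHA256(),
            length=32,
            salt=None,
            info=b"attestation-key-v1|" + device_id,
        ).derive(puf_response)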

I agree with another commenter here that the use case for this is DRM, but the economics of why to use PUFs over more dynamic and resilient actual cryptosystems haven't been compelling so far. In the cases I've read, "more security!" is both dubious and past the point of marginal diminishing returns on increased security. Sure, you can never have too much assurance, but how much are you really going to pay for, and at what flexibility/switching cost?

For PUFs to be used in case b), they would have to be more economical than provisioning secure elements, and the certification and accreditation of the PUF modules would be a big part of those economics. Their failure mode is also catastrophic, as you can't update the PUF component across a device ecosystem, so the use cases are automatically narrow, like maybe disposable drones.

Everything about my reading on them so far has been designed to persuade people who are not equipped to reason about them, and sometimes to discredit anyone who is. The hidden counterfactuals to me are that it depends on standards bodies approving one, as nobody serious is going to touch it otherwise, and then a customer is going to take on a PUF module vendor and have their entire product line and supply chain depend on the company that produces it. There are precedents for that, as that's what companies like Gemalto, Oberthur and their peers do, but those companies are such that it doesn't matter what their tech does because they are unique entities among tech companies. If one of them had a PUF product that was competitive with their other business, it would be just another tech that was effectively mandated on an industry top down.

It also seems to presume that the problems in security stem from our ability to produce sufficient entropy/randomness (rather than from verifying its integrity), and the reliance on a single or a few points of centralized PUF key generation means it is guaranteed to be sabotaged by IC actors.

The conversation goes something like, "we have a physical secret thing nobody can ever find, but you can verify it!" and you ask, "how do I know nobody else knows it?" and they say, "you trust us!" so you ask, "can your insurance handle replacing my entire market cap?" and they say, "not necessary because of physics! where is your physics degree from again?" and it usually ends there, with a PUF vendor mystified at how their potential customer can make so much revenue but still be so stupid as to reject the brilliance of their discovery.



