"Microsoft did a good job designing SDCP to provide a secure channel between the host and biometric devices, but unfortunately device manufacturers seem to misunderstand some of the objectives. Additionally, SDCP only covers a very narrow scope of a typical device’s operation, while most devices have a sizable attack surface exposed that is not covered by SDCP at all.
Finally, we found that SDCP wasn’t even enabled on two out of three of the devices we targeted." Oof.
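For anyone who hasn't read the SDCP spec: the core idea is that the sensor matches the finger in its own hardware and then proves to the host that a given "user X matched" response is fresh and really came from it. Here's a minimal sketch of that idea, assuming a boot-time session key already shared between host and device -- the function names and the exact MAC construction are illustrative, not the real spec:

    # Sketch of SDCP's authenticated-response idea, NOT the actual protocol.
    import hmac, hashlib, os

    def host_begin_auth() -> bytes:
        # Fresh nonce per attempt, so a captured response can't be replayed.
        return os.urandom(32)

    def device_authenticate(session_key: bytes, nonce: bytes, template_id: bytes):
        # Device matches the finger on-chip, then MACs the result together
        # with the host's nonce using the boot-time session key.
        mac = hmac.new(session_key, nonce + template_id, hashlib.sha256).digest()
        return template_id, mac

    def host_verify(session_key: bytes, nonce: bytes, template_id: bytes, mac: bytes) -> bool:
        expected = hmac.new(session_key, nonce + template_id, hashlib.sha256).digest()
        return hmac.compare_digest(expected, mac)

    # Demo with a made-up session key:
    key, nonce = os.urandom(32), host_begin_auth()
    tid, mac = device_authenticate(key, nonce, b"template-123")
    assert host_verify(key, nonce, tid, mac)

Without that verification step (i.e. SDCP disabled), the host has no way to tell a genuine match result from a message injected by anything else sitting on the bus.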
How do I, as a consumer, validate that my fingerprint sensor uses SDCP?
Reading through all of this, it seems I can protect myself by enabling cover-tamper detection (which prevents a hacker from replacing the fingerprint reader without tripping the TPM) and by only allowing booting into Windows.
First sentence of this clickbait horse shit:
Security researchers have found flaws in the way laptop manufacturers are implementing fingerprint authentication.
I dunno, it kind of seems fair to discuss what the thing actually looks like in the wild? Especially since to the average end user, the only information is "does this thing have the 'Hello' sticker on the case?" Like, yes, we need to talk about the actual details and the article should always lead with that clarification, but I'm not sure it's wrong to use that title.
Based on the Blackwing paper, the "Microsoft Surface Type Cover" hack seems pretty legit. They can gain login access to a Surface-with-cover by simply unplugging the cover and plugging in their evil USB device. That sounds like a bypass to me, no horse shit required.
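For context: without SDCP, the host's only binding to the sensor is its USB descriptors, so any device enumerating with the same IDs looks identical to the driver. A rough sketch of that weak binding in Python with pyusb (the vendor ID is ELAN's; the product ID here is a placeholder, not the Surface sensor's actual one):

    # Sketch: descriptor-based matching is all that ties the host to the
    # sensor when SDCP is absent. Vendor/product IDs are trivially spoofable.
    import usb.core

    SENSOR_VID = 0x04F3  # ELAN Microelectronics
    SENSOR_PID = 0x0000  # placeholder product ID, not the real sensor's

    dev = usb.core.find(idVendor=SENSOR_VID, idProduct=SENSOR_PID)
    if dev is not None:
        # The host would now trust this endpoint for match results, whether
        # it's the genuine sensor or an attacker's board with spoofed IDs.
        print(f"Found 'sensor' at bus {dev.bus}, address {dev.address}")

With no cryptographic challenge, the evil USB device only has to say "user matched" in the right packet format.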
And this, ladies and sirs, is why we should prefer MacBooks and Linux for security-critical productivity work. Microsoft Windows and its hardware partners, with their device drivers and lack of transparency, have been a dumpster fire for the last 30 years.
Lead researcher on this here. I've spent a big part of my career exploiting Apple, and I definitely wouldn't say Linux is better for security-critical work.
Out of interest, does this flaw impact Linux as well if we have one of these devices?
It sounds like it depends on a specific Microsoft protocol, which I guess means it's unknown whether these fingerprint sensors are secure under Linux, but I wanted to check in case this was investigated.
I'm curious what your thoughts are on potential solutions for writing 'secure' software as an industry (secure enough to resist non-APT attackers, assuming the human does not fall for phishing; just hardware/software vulns)?
Slower releases? Formal verification everywhere? Demanding more than 3 days' notice and vague requirements before we start implementing features? Unionization/licensure so devs can push back when bosses of various stripes want to cut corners like they have gotten to do their entire careers?
It is clear that the industry of popular commercial computing devices at large does not have the incentives to ship 'secure' code and hardware as defined above.
What would it take to make the default state be secure instead of insecure?
My attempt at fixing the incentives leads me back to UNIX and C, which were a massive paradigm shift from the mainframes before them and are arguably the foundation of modern computing at large today.
Trying to kill our old '80s, insecure-unless-you-really-work-at-it UNIX & C dynasty might be a good step, but it seems almost impossibly painful for us to do.
It's like how medical best practices in some areas can only move forward one old-guard doctor's death at a time. We all (me included) are used to this von Neumann architecture on one of several popular ISAs, writing imperative code that pretends to be running on a PDP-7, plus or minus lots of abstraction, everywhere from embedded to Blazor WASM web programming.
Let's say someone does do that: a new OS from first principles, new programming language(s), an entire stack built from, IDK, the ARM or x86 ISA on up. VC money would demand they lock the device out of general-purpose computing. And few would buy it, because of their sunk-cost investments in the cloud / SAP / Active Directory corporate networks / X-gen enterprise offerings of the past that this would not integrate with as easily.
Yikes. What will it take to make us as an industry write 'secure enough' software by default?
The Linux side was actually part of the vulnerability here:
"After comparing the enrollment operations on the Windows and Linux sides, we found that while enrolling via Windows follows the enrollment process specified in the SDCP spec, enrolling via Linux does not."
Apple certainly uses hardware partners to make components of their systems. Apple has an advantage in that they can charge more for their systems due to their brand value and reputation. Since the value of their products is high, they have less need to cut corners on quality and oversight to maintain margin. Chinese manufacturers produce to the quality and spec you give them and pay for.
I wouldn't look to Linux for security-critical productivity work. Most cybersecurity experts run Windows, with Linux in a VM. Or they run MacBooks because of battery life, etc. In that case, it comes down to the tooling they can use to exploit vulns and so on.
Linux would be more private, but privacy != security, and it all comes down to what your threat model is. The three major OSes each have their own security mitigations that you can use to reduce the attack surface.
In general, in a multi-user networked environment, Windows is going to be more secure than Linux -- Linux works great as a server because it can be run barebones, thus reducing the attack surface of the OS.
"Microsoft did a good job designing SDCP to provide a secure channel between the host and biometric devices, but unfortunately device manufacturers seem to misunderstand some of the objectives. Additionally, SDCP only covers a very narrow scope of a typical device’s operation, while most devices have a sizable attack surface exposed that is not covered by SDCP at all.
Finally, we found that SDCP wasn’t even enabled on two out of three of the devices we targeted." Oof.