Hacker News

> The main exception to this, I believe, would be for critical systems where compromising operation through software modification presents too high a risk. Examples I'm thinking of include:

    > certain medical devices, such as implants and insulin pumps
    > subsets of electronic control units for cars
These are precisely the opposite of what should be exceptions. If you have a pacemaker implanted in your body, you need the right to replace whatever software is running on it before the manufacturer goes out of business and takes the signing keys with them.

There exists no thing where the owner of the device should not have the right to replace the software it runs, and the more safety-critical the device the more important the right.




Disclaimer: I don't have a pacemaker. As a biohacker, I think this is a really bad take. I regularly put things [1] of questionable provenance in my body, and then cut them out of myself without anesthesia when they no longer suit me, but I like being alive too much to mess with a medical device like a pacemaker. Pacemaker hacking sounds hardcore, and respect to anyone who does it, but I don't think it should be easy to flash one of those things, because I'd really prefer nobody do that to me in my sleep.

[1] https://dangerousthings.com/


You pretty obviously don't want to mess with it frivolously. But if there's something wrong with it, and you have to fix it? That seems better than the alternative where you can't. Note that the right to modify it doesn't imply that you're required to in the absence of any reason to.

Also, if someone wants to kill you in your sleep, they... don't need you to even have a pacemaker. And the security of medical devices is notoriously bad, so if you're worried about that sort of thing, be more worried that the status quo doesn't allow you to fix the existing remotely exploitable wireless security vulnerabilities.


> existing remotely exploitable wireless security vulnerabilities

That's just it, though: in my opinion, being able to flash the thing at all would count as a remotely exploitable wireless security vulnerability. The first thing I'd do if mine were flashable is lock it down to make sure it was no longer flashable. Does that make sense? I might not be articulating myself well here.


Removing the ability to flash it seems like a bad idea. Suppose they find a bug and release an official patch. You want it so you don't die, right? The alternative to flashing the one that's in you is that you need chest surgery again to replace it.

If it has a mechanism to flash it then they can give you the password for yours so that you can always do it yourself (or have someone do it) in the event that the manufacturer goes out of business before anyone finds the bug.

And if you really want to remove the ability to flash it, you could use your right to flash it to remove that feature, whereas the status quo is that it supports it -- insecurely -- and you aren't allowed to change it.
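The "give you the password for yours" idea could be sketched roughly like this: updates are only accepted when authorized with a per-device secret that the owner holds. This is a minimal illustration, not how any real pacemaker works; all names and the key-tag scheme here are hypothetical, and a production design would use proper asymmetric signatures and secure key storage.

```python
import hmac
import hashlib

def accept_firmware(image: bytes, tag: bytes, device_key: bytes) -> bool:
    """Accept a firmware image only if the update is authorized with this
    device's own key -- the 'password' the owner would hold (hypothetical)."""
    expected = hmac.new(device_key, image, hashlib.sha256).digest()
    # Constant-time comparison so the check doesn't leak key material.
    return hmac.compare_digest(expected, tag)

# The owner (or the manufacturer, while it still exists) holds device_key
# and can authorize an update; anyone without it cannot flash the device.
device_key = b"per-device-secret"  # hypothetical owner-held key
image = b"patched firmware v1.1"
tag = hmac.new(device_key, image, hashlib.sha256).digest()

assert accept_firmware(image, tag, device_key)        # authorized update
assert not accept_firmware(image, tag, b"wrong-key")  # rejected without the key
```

The point of the sketch is that "flashable" and "anyone can flash it" are different properties: the update path stays available for official patches, but only the holder of the per-device key can exercise it.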




