
I'm genuinely surprised that people write buggy, undisciplined software when they're dealing with something that can kill many people.



I have some small experience with this...

My impression is that you end up with bad code in critical systems due to the exact same forces as anywhere else.

The difference is that, rather than ramping up the quality of development and testing with the seriousness of the application, people tend to ramp up the requirements hair-splitting, ass-covering, and accountability obfuscation.

Fighting this requires a rare, uncompromising attitude that often isn't conducive to remaining employed.


That's my experience too.

In addition to that, there are hardware companies that have always seen software as necessary but uninteresting, so they try to cut costs on it as much as possible and don't know how to set up an environment conducive to good software development (like having non-programmer physicists code in C, or still being stuck on '80s-era source control tools).


This reminds me that I should start asking Therac-25 questions in interviews.


Searching for that led me to this website about a variety of serious bugs.

http://www5.in.tum.de/~huckle/bugse.html


(A classic case study -- a radiation therapy machine that was badly programmed, resulting in several accidents where patients were given massive overdoses of radiation).

http://en.wikipedia.org/wiki/Therac-25
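
For anyone who hasn't read the Leveson/Turner investigation: one of the documented Therac-25 defects is a good concrete example. A shared one-byte variable (called Class3 in the report) was incremented, rather than set to a fixed nonzero value, on each pass through the setup routine; every 256th pass it rolled over to zero, and the collimator-position check was silently skipped. Here's a minimal sketch of that failure mode in C with hypothetical names (the original was PDP-11 assembly), just to illustrate the mechanism:

  #include <stdint.h>
  #include <stdio.h>
  #include <stdbool.h>

  /* Hypothetical reconstruction of the Therac-25 "Class3" overflow:
   * an 8-bit flag was incremented instead of set, so every 256th
   * pass it wrapped to 0 and the interlock check was skipped. */

  static uint8_t class3 = 0;        /* one-byte shared flag, per the report */
  static unsigned checks_done = 0;

  static bool collimator_in_position(void) {
      return false;                 /* simulate a misaligned turntable */
  }

  static void setup_test_pass(unsigned pass) {
      class3++;                     /* BUG: wraps 255 -> 0 */
      if (class3 != 0) {
          if (!collimator_in_position())
              checks_done++;        /* interlock trips; treatment pauses */
      } else {
          printf("pass %u: interlock check silently skipped\n", pass);
      }
  }

  int main(void) {
      for (unsigned pass = 0; pass < 512; pass++)
          setup_test_pass(pass);    /* bypass fires at passes 255 and 511 */
      printf("checks performed: %u of 512\n", checks_done);
      return 0;
  }

The reported fix was equally telling: set the flag to a fixed nonzero value instead of incrementing it, without revisiting the underlying design.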


This is one of the reasons I opt out of airport scanners. The less I stand in front of something emitting radiation, the less chance there is for a software error to give me a higher dose than intended.



