Hacker News

You have to remember what "calling to account" is. It is a demand that you explain yourself. In the case of a business venture, it means presenting the books and detailing the entries. A court, Congress, or your boss can demand your presence at a meeting to "explain yourself". Accountability doesn't mean punishment; it means you are subject to demands to give an "account" of yourself. Punishment is a separate thing from the account.

If your account implicates you in malfeasance, you might be punished. But that's good! There are other kinds of accountability, too. The FAA is very clear that you must give an account of yourself. But the FAA is also very clear that it won't punish you for that account. That doesn't mean you aren't accountable!

Most computer systems cannot do this! They cannot describe the inputs to a decision, the criteria used, the history of policy that led to the current configuration, or the reliability of the data. That's why lawsuits always need technical witnesses to do these things. And why the UK Post Office scandal was so cruel.

Systems that grant actors immunity from accountability as a matter of policy are terrible systems that produce terrible results. Witness US prosecutorial immunity.






But, in that regard, can't systems be made such that they can, in fact, be held accountable? You can design them to list their inputs and explain why the outputs were set to what they are.

Given the speed at which systems operate today, many are actually expected to act before a person can do anything, and they do. The world is rife with examples where humans overrode a safety system to their peril. (I can build a small list, if needed.) That doesn't mean safety systems have never messed up. But nor does it mean we should stop building them.


Yes. Some systems can be made more accountable. They can provide traceability from inputs to outputs, provide reflexive access to the source code they run, and provide evidentiary traces for reliability.
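To make the idea concrete, here is a minimal sketch (all names hypothetical, with a toy loan-approval criterion invented for illustration) of a decision record that captures the three things an "account" needs: the inputs seen, the criteria applied, and the code revision that ran.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

# Hypothetical sketch: each decision carries the evidence needed
# to "give an account" of itself later.
@dataclass
class DecisionRecord:
    inputs: dict     # the raw inputs the decision saw
    criteria: str    # the policy/rule version applied
    output: str      # what the system decided
    source_rev: str  # revision of the code that ran
    timestamp: str = field(
        default_factory=lambda: datetime.now(timezone.utc).isoformat()
    )

def decide_loan(income: float, debt: float, policy_rev: str = "policy-v3") -> DecisionRecord:
    # Toy criterion: approve if the debt-to-income ratio is under 0.4.
    ratio = debt / income
    output = "approve" if ratio < 0.4 else "deny"
    return DecisionRecord(
        inputs={"income": income, "debt": debt, "ratio": ratio},
        criteria=f"{policy_rev}: ratio < 0.4",
        output=output,
        source_rev="git:abc123",  # placeholder for a real VCS revision
    )

record = decide_loan(income=50_000, debt=30_000)
print(record.output)    # deny (ratio = 0.6)
print(record.criteria)  # policy-v3: ratio < 0.4
```

The point is not the toy rule but the shape: if every output is stored alongside its inputs and criteria, the system can later be "called to account" without reverse-engineering by expert witnesses.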

Safety-critical systems that operate faster than human reactions are not accountable, which is why we never make them responsible. So who is? Same as for bridges that fall down: the engineers. People forget that civil engineers sign up for accountability that can lead to serious civil or even criminal liability. Which is exactly the point of this aphorism.

Boeing was facing criminal charges, and is currently under a consent decree for exactly this kind of sloppy systems work.


I can agree that "management decision" is doing a lot of lifting on that page, but my point is pretty strictly that it is overstated. I'm used to this being discussed much more generically.

That is, just as I am ok with AES on cars, I am largely ok with the idea that systems can, in fact, be designed in such a way that they could rise to the level of accountability that we would want them to have.

I'm ok with the idea that, at the time of that manual, it was not obvious how cheap and plentiful durable storage would become. But I'd expect that vehicles, and anything with automated systems, should keep a system log that is available and can be called up as evidence.
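For a log to serve as evidence, it helps if tampering is detectable. Here is a hedged sketch (structure and field names are my own invention, not any vehicle standard) of an append-only log where each entry's hash covers the previous entry's hash, so edits or deletions break the chain.

```python
import hashlib
import json

def append_entry(log: list, event: dict) -> None:
    # Each entry's hash covers the previous entry's hash, forming a chain.
    prev_hash = log[-1]["hash"] if log else "genesis"
    payload = json.dumps(event, sort_keys=True)
    entry_hash = hashlib.sha256((prev_hash + payload).encode()).hexdigest()
    log.append({"event": event, "prev": prev_hash, "hash": entry_hash})

def verify(log: list) -> bool:
    # Recompute every hash from the start; any edit breaks the chain.
    prev_hash = "genesis"
    for entry in log:
        payload = json.dumps(entry["event"], sort_keys=True)
        expected = hashlib.sha256((prev_hash + payload).encode()).hexdigest()
        if entry["prev"] != prev_hash or entry["hash"] != expected:
            return False
        prev_hash = entry["hash"]
    return True

log = []
append_entry(log, {"t": 0, "system": "brake", "cmd": "engage"})
append_entry(log, {"t": 1, "system": "brake", "cmd": "release"})
print(verify(log))  # True

log[0]["event"]["cmd"] = "release"  # tamper with history
print(verify(log))  # False
```

Real event recorders do more (secure hardware, signed timestamps), but even this simple chain makes "the log was altered after the fact" a checkable claim rather than a matter of trust.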

And yes, Boeing was facing criminal charges. As it should. I don't think it should be a witch hunt against individuals at Boeing for being the first or last in line to sign something.



