Hacker News

Reading just the syllabus, I was surprised to see no mention of accountability. Quick Ctrl+F searches for "accountability", "appeal", and "review" gave no results. "Reputation" appears, but in a section rather harshly titled "Herding and Conformity", about the reputations of the people not trusting algorithms, not the people making or deploying them.

In my own experience, human forecasters and decision-makers tend to be much easier to hold accountable for bad forecasts and decisions. At a minimum, they stake their reputations, just by putting their names to their actions. With algorithms, by contrast, there's often no visible sign of who created them or decided to use them. There's often no effective process for review, correction, or redress at all.

This may be partly explained by the fact that high-volume, low-risk decisions tend to get automated more often. That same fact may, in turn, partly explain general attitudes toward algorithms.






"A computer can never be held accountable, therefore a computer must never make a Management Decision." (1979)

My only problem with your comment is that human forecasters and decision-makers are also often not held accountable for their work.

At most, they either tarnish their reputations or gain reputation points.

Under that interpretation of accountability, you could easily hold a machine accountable for decisions: if it loses enough "reputation points" that people no longer trust it to make the right decision, the machine can be replaced.


