
One of the huge questions is: who takes responsibility for harm caused by black-box learning systems?

In France, since you have to be able to explain in plain words any algorithm used to help make an administrative decision, black-box learning systems are indirectly illegal for that purpose.

(And yet they have probably already been used by the police?)

Now, the naive response is that the engineer who implemented a system should take responsibility for any harm it causes. And I guess that if these engineers could lose their license to use a computer and face jail time, they might also start putting pressure upstream on researchers to provide better systems?
