
Agree to disagree. One of the main “enforcement arms” of ethics research is to say the unsaid things out loud. Saying them introduces legal liability onto the company in future lawsuits.

I.e. “Google knew that X would result in Y and they still released X; they should pay for the damages of Y.”

The PR side of ethics research barely matters in comparison.




> Saying them introduces legal liability onto the company in future lawsuits.

My first thought was ExxonMobil when I read this comment.

> In July 1977, a senior scientist of Exxon James Black reported to the company's executives that there was a general scientific agreement at that time that the burning of fossil fuels was the most likely manner in which mankind was influencing global climate change.

> According to the Union of Concerned Scientists, "The funding of academic research activity has provided the corporation legitimacy, while it actively funds ideological and advocacy organizations to conduct a disinformation campaign."

Nothing has happened to ExxonMobil. None of the executives are in prison as far as I know. You'd think the company would be bankrupt by now...

https://en.wikipedia.org/wiki/ExxonMobil_climate_change_cont...


Humanity has a fair history of introducing products that are later shown to have large externalized costs: tetraethyl lead, PVC, etc.

I interpret your point to be that publicly investigating the harm of future products, by the only people who can do so, is a bad idea. What is the alternative? Regulation? (Gasp!)

Perhaps we should start the disinformation campaigns before the products are developed. "If you are arrested on AI-provided evidence, you are guilty. Period."



