
The AI Community Needs Fo Take Responsibility for Its Technology and Its Actions - jaTix
https://cacm.acm.org/opinion/articles/241640-the-ai-community-needs-fo-take-responsibility-for-its-technology-and-its-actions/fulltext
======
_0ffh
No, companies need to take responsibility for what they sell, and how.

Also, unfortunately, when I hear "The algorithms pushing content online have
profound impacts on what we believe" I can't help but also hear "so we must
take control of these to help push our agenda. For the greater good![1]"

[1] (tm) Gellert Grindelwald

------
ncmncm
It won't, though.

Companies that use it won't, without law requiring it. Police departments
won't either.

------
darawk
> "There's no such thing as a neutral platform,"

This is a very weird, and pretty clearly false, statement. There is absolutely
such a thing as a neutral platform: A completely un-moderated forum is a
neutral platform. We may or may not like what the outcome of that is, but it
is neutral.

~~~
gmueckl
No, unmoderated online platforms can easily become biased when dominant
groups of members are intolerant of views they don't share. The result is
implicit moderation.

~~~
simianparrot
The platform is still neutral in your example, even if the dominant community
drives "outsiders" to self-censorship or self-exclusion.

The platform is the carrier, not the message.

------
roenxi
It is a pity that she mixed the gender issues with the technology issues, and
it makes a lot of sense why she normally wouldn't.

Is the focus here the technical observations or the #MeToo observations?
Because one of those is fairly noncontroversial and likely to prompt
general nodding and murmuring. The other is going to start a flamewar about
cases like Stallman's.

------
mam2
Instead of wishing and throwing a tantrum when the world isn't what one would
like it to be, I would very much like to propose another way, which is the
only one proven in history to actually change it. It's called "taking
action".

OpenAI was a good consequence of that premise. If you are not happy about it,
then create another one.

------
loa_in_
If I become the author of an algorithm or trained network that can break the
world (do a Big Bad), upload it to GitHub, and claim that it can break the
world and you absolutely must not use it, can I be held responsible?

------
aww_dang
Copernicus was also guilty of wrongthink.

Why is it necessary for walled gardens to enforce political standards on a
supposedly decentralized Internet?

------
dmichulke
Re: _The AI Community Needs fo Take Responsibility for Its Technology and Its
Actions_

"Fo take" is probably a typo but "to fake" doesn't really fit either ;-)

------
fxtentacle
First, the AI community would need to understand its technology itself ;)

------
nathias
never trust a psychologist with epistemology

------
theqult
The AI Community Needs to FAKE Responsibility for Its Technology and Its
Actions

