
Ask HN: Why shouldn't platforms be responsible for what they host? - jmhnilbog
The Communications Decency Act pretends platforms are somehow different from publishers. ISPs and such, sure--I see a benefit in not holding the on-ramp to the internet at large accountable for everything one might encounter there.

Individual websites, though--Twitter, Facebook, example.com--how are they different from traditional publishing platforms? If NBC decided to let anyone who felt like it broadcast anything they liked, without supervision or editing, it would be shut down once the first reports of "Hey, NBC is showing me child pornography and beheading videos" came in.

Social media's argument appears to be, "Well, we just provide a free speech platform for horrible things. And after we cause damage and people complain, we take the bad content down. In N years we'll have AI that can scan things before anyone else sees them, so we can ban things secretly, which is somehow compatible with our free speech mission."

Meanwhile, NBC has already figured out what every hacker ought to know: blacklists don't work, whitelists do. Someone has to be responsible, experience the content first, make a decision to broadcast it or not, and take the heat for a bad choice. It feels like tech execs are just piling ramparts of code and rare earth between themselves and any responsibility.

What's wrong with a world in which someone at YouTube has to approve each cat video before it's posted? Where, if you want to post a live video of yourself murdering Muslims, you have to get your own domain and hosting account?
======
throwaway8879
Are you going to pay YouTube to hire the thousands of moderators who would
sit and manually approve videos all day long?

There is nothing "wrong" with a video itself. You could argue about the
morality of the act it depicts, but that's a separate issue.

We as a collective have made a sort of common agreement that it's beneficial
for internet platforms to be open and free from the tangles of traditional
regulations.

Questions of "should" or "should not" don't have much use in the real world
on their own. If you want internet platforms to be more strictly regulated,
then use your vote to make the changes you want to see in the world.

I don't see any sense in censoring a video of death and destruction. If you
don't want to watch it, don't watch it. If YouTube doesn't want it on their
platform, they should have the right to remove it. If they want to host it,
public outrage shouldn't force them to remove it, unless they take it down of
their own accord for marketing and public relations reasons.

~~~
jmhnilbog
I'm not arguing to pay YouTube to hire thousands of moderators. I'm arguing
that YouTube should not be allowed to post content without taking
responsibility for it. If YouTube can't find a way to act responsibly, why
should it be allowed to function at all?

I am not trying to say what video should be censored. I'm saying that if you
host the thing, you should be held responsible for it. A large tech company
shouldn't be able to say "hey, we made it easy for anyone to do this thing,
heck, we even pay them to do it, but we have no responsibility for what they
do because we made it too easy for too many people to do it and, uh, we don't
want to pay people to look at all this garbage."

------
mindcrime
_If NBC decided they were going to allow anyone who felt like it broadcast
anything they liked without supervision or editing, it would be shut down once
the first reports of, "Hey, NBC is showing me child pornography and beheading
videos" came in._

Probably not, just like Youtube hasn't been shut down because some
disagreeable content showed up.

 _Meanwhile, NBC has already figured out what every hacker ought to know.
Blacklists don't work, whitelists do._

That completely depends on your definition of "work". If "work" includes _not_
having false positives that block things that shouldn't be blocked, and not
unnecessarily delaying the time it takes to publish something, then neither
option is terribly good.
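The fail-open vs. fail-closed distinction behind the blacklist/whitelist claim can be put in code. A minimal sketch, using hypothetical content labels (the names and lists here are invented for illustration): a blacklist permits anything not explicitly banned, so novel bad content slips through unreviewed, while a whitelist permits only what has already been approved.

```python
# Hypothetical content labels for illustration only.
BLACKLIST = {"beheading_video"}           # known-bad content
WHITELIST = {"cat_video", "music_video"}  # reviewed-and-approved content

def blacklist_allows(label):
    # Fails open: anything not explicitly banned gets published.
    return label not in BLACKLIST

def whitelist_allows(label):
    # Fails closed: anything unreviewed is held back by default.
    return label in WHITELIST

# Content no moderator has ever seen before:
novel = "new_atrocity"
print(blacklist_allows(novel))  # True  -- published unreviewed
print(whitelist_allows(novel))  # False -- held for review
```

The asymmetry is the whole point: the blacklist's error mode is publishing the unknown-bad, while the whitelist's error mode is delaying the unknown-good, which is exactly the false-positive/latency trade-off described above.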

 _What's wrong with a world in which someone at YouTube has to approve each
cat video before posting it?_

1. It doesn't scale very well.

2. The decisions would almost certainly be wildly inconsistent, subjective,
and controversial. Which is to say, it would be, at best, a marginal
improvement over the situation today.
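The scaling objection is easy to quantify with back-of-envelope arithmetic. Assuming the widely cited figure of roughly 500 hours of video uploaded to YouTube every minute (treat that number as an assumption, not a verified statistic):

```python
# Back-of-envelope moderator headcount for pre-publication review.
# Assumption: ~500 hours of video uploaded per minute.
hours_uploaded_per_minute = 500

# Every real-time hour, 500 * 60 = 30,000 hours of video arrive,
# so watching at 1x speed needs 30,000 moderators on duty at once.
review_hours_per_hour = hours_uploaded_per_minute * 60

# Covering 24/7 with three 8-hour shifts triples the headcount,
# before accounting for weekends, breaks, or double review.
total_headcount = review_hours_per_hour * 3

print(review_hours_per_hour, total_headcount)  # 30000 90000
```

Under these assumptions, pre-approval implies a review staff on the order of a hundred thousand people for one platform, which is the substance of point 1.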

~~~
jmhnilbog
Gawker, a publisher, was bankrupted by a lawsuit after it published footage of
Hulk Hogan's sexual activity. I think we can assume that larger publishers
would be shut down for more outrageous content.

Yes, obviously, manual approval of content does not scale very well. Scaling
is not a good in itself.

Decisions will only be as "wildly inconsistent, subjective, and controversial"
as the people responsible for them. That should allow consistent, objective,
and responsible people to rise to the top of the decision-making process, no?

