
Washington's rush to smash tech's liability protection - throwaway888abc
https://www.axios.com/washingtons-rush-to-smash-techs-liability-protection-deb16ceb-580a-4125-909a-ba786ab07edb.html
======
travisoneill1
On the one hand it seems completely reasonable that websites like HN should be
able to both display user generated content without liability, and also apply
moderation at will. On the other hand that kind of negates the distinction
between a neutral platform and publisher. Any platform could display user
generated content, but curate it via moderation to only show the opinion of
its owners, which makes it sort of equivalent to a publisher, but does an end
run around the standard rules of liability for such an entity. I really don't
know what the best way to handle this is.

~~~
adjkant
Curated user generated content and a publisher are very different, and this
needs to be distinguished. The two have been conflated quite a lot in the past
few weeks of events.

While it is theoretically possible to moderate to the point where they are
close to each other, this is not the reality of any of the cases being
discussed. Such
draconian levels of moderation will kill any community that was not already a
full echo chamber without diversity, and not in the "oh Facebook is an echo
chamber" way but in a true, cult-like following fashion.

Affiliation vs Promotion vs Allow - Publishers have directly affiliated
authors. Algorithms and manually curated sets within UGC are promotion. Most
all other social postings are simply "allowing". These distinctions are
important, and arguably each have different legislative implications.

~~~
travisoneill1
I don't think they are as different as you say. Take the example of the NY
Times and Youtube. Both accept content from unaffiliated authors, monetize it,
and pay the authors. They both apply censorship based on the ideas expressed
by those creators (note the recent censorship of Tom Cotton by the NY Times).
Why should the NY Times be liable for what it runs but at the same time
Youtube avoids this liability?

~~~
lloda
Say YT censors after publication, NYT censors before. If you're liable either
way, then the first model is just not possible.

~~~
travisoneill1
Well that brings me back to my first point: on the face of it, the business
model seems perfectly reasonable. I brought up Tom Cotton specifically because
in that case the NYT moderated after publication, like YT, but in any case I
don't see why the timing of moderation should determine liability or the lack
thereof.

~~~
gamblor956
In the real world, _when_ something happens is very important.

Applying moderation before publication is _active_ moderation, so everything
that appears is approved by the "publisher." Applying moderation _after_
publication is _reactive_ moderation, so things the publisher disagrees with
could get published.

It's like the difference between shooting someone to death and shooting a
corpse.

------
dang
There was a big thread about this yesterday:
[https://news.ycombinator.com/item?id=23550215](https://news.ycombinator.com/item?id=23550215).

I'm not seeing SNI (significant new information [1]) in this submission, so I
think we need to downweight this thread as a follow-up [2].

[1]
[https://hn.algolia.com/?dateRange=all&page=0&prefix=false&qu...](https://hn.algolia.com/?dateRange=all&page=0&prefix=false&query=by%3Adang%20%22significant%20new%20information%22&sort=byDate&type=comment)

[2]
[https://hn.algolia.com/?dateRange=all&page=0&prefix=true&que...](https://hn.algolia.com/?dateRange=all&page=0&prefix=true&query=by%3Adang%20follow-up&sort=byDate&type=comment)

------
supernova87a
Rush? Are you kidding? If anything our policymakers have been slow, led
around with blinders on, unaware of the issues arising from tech's influence
on society, for a decade.

If you don't think that these social media tech companies have done something
fundamentally changing to our society and political process, then you've got a
head in the sand. And at some point, like corporate finance in elections, the
situation could grow to where they cannot even control it any more, and are
just pawns in a social media game.

I'm not saying that Washington policymakers will handle it right, and should
go headlong into poorly thought out action. But the idea that we have been in
a rush to do something too soon, and should wait and see some more to figure
out what's happening -- that is a joke.

------
duxup
I feel like, with attacks on 'the media' and so forth, we'll see this grow
into a larger ball of cynicism about everything, where everyone just clings to
their given biases, ignorance, and ideology.

------
DevKoala
The silver lining here is that this could potentially push companies to shift
user generated content to data models that protect privacy in order to absolve
the platform provider from any responsibility. Signal for example has avoided
information requests thanks to their architecture that respects privacy.

------
m0zg
I find it remarkable that HN readers can be simultaneously in favor of Net
Neutrality and in favor of censorship. How can these two opinions coexist? The
flipside is also true: I find it remarkable that conservatives are _against_
Net Neutrality and censorship at the same time.

Whereas the right answer looks pretty obvious to me: I'm for Net Neutrality
and against censorship.

~~~
centimeter
Net neutrality is itself a (soft) form of censorship, in that you're taking
away a private party's ability to make pricing decisions on information
transfer. That said, I find it much less objectionable than what e.g. Facebook
and Google are doing - mostly it just has negative economic consequences.

~~~
m0zg
> taking away a private party's ability to make pricing decisions on
> information transfer

Seems like it's a soft form of anti-censorship then.

------
orev
It’s very clear what is going on here, and it’s concerning that none of these
articles actually point it out. None of this is meant to go through. It
doesn’t matter if it will be struck down on First Amendment grounds, or if it
will eventually lead to removal of more content. What does matter is how much
money it will take to launch legal defenses, that the process will drag
companies through the mud, and that it will not be over until after the
election.

The timing of when this has started is not a coincidence. There are enough
months left before the election to make a lot of noise and scare the
companies, but not enough time for it to actually be resolved. During this
time, they will be able to drum up support by claiming that “these liberals”
are trying to censor conservative voices. All of it is a show to whip up The
Base
to come out to vote and “save our freedom”.

These moves are meant as a warning to companies that the government plans to
tie them up in legal proceedings for years if they don’t fall in line.

Anyone making arguments about the logical end results (if these things go
through) has missed the point.

~~~
Wistar
Sort of a SLAPP suit?

------
MintelIE
Welp, looks like if you decide to edit what people post you become a
publisher. Just like it says in Section 230.

------
liaukovv
Big tech gave up the moral right to that protection as soon as they decided to
interfere with what is posted. It's like claiming that you don't generate any
of the content, because you "just" select what to ban from an endless stream
of randomly generated content.

~~~
graeme
Moderation was explicitly considered as part of the reason for section 230.
What are you basing your opinion on?

~~~
liaukovv
How many people on twitter were banned for "eat the rich"?

~~~
schoen
The parent's point is not that tech companies are unbiased or shouldn't be
criticized for their bias, it's that §230 _was originally adopted specifically
to allow subjective moderation_. It's a misconception that the law was
predicated on a neutrality or objectivity obligation. Instead, it was meant to
_reverse_ court decisions that suggested that such an obligation might exist
as a requirement for protection of intermediaries. §230 was meant to _allow_
intermediaries to be protected while exercising editorial control.

That doesn't mean that platforms' actual editorial control is "fair"; it just
means that the argument that platforms should obviously have to lose §230
protections over their editorial biases misrepresents the history of this
legislation.

[https://en.wikipedia.org/wiki/Section_230_of_the_Communicati...](https://en.wikipedia.org/wiki/Section_230_of_the_Communications_Decency_Act#Background_and_passage)

~~~
liaukovv
Shutting out half of the political spectrum doesn't strike me as the original
intention.

The idea was "to keep the internet civil," which makes a lot of sense, but
that's obviously not what is happening and not what provoked this reaction.

~~~
jbermudes
On the contrary, that is exactly why it was created. Dog lovers forums aren't
supposed to be compelled to allow cat lovers a platform to voice their opinion
on the dog lover's private server. Private servers are not public forums in
the legal sense. We can have a conversation on whether or not that needs to
change, but basic US law on private property is very clear about this.

~~~
liaukovv
Law follows morals, so first we need to determine what we actually want.

Dog lovers forum is not at issue here because it doesn't matter to how society
functions, giant twitters and facebooks on the other hand very much matter and
should be re-evaluated.

~~~
jbermudes
Large websites, large skyscrapers. Still private property.

