
Censorship is bad even when it’s done by private companies - barry-cotter
https://necpluribusimpar.net/censorship-is-bad-even-when-its-done-by-private-companies/
======
jvickers
I have a wide view of what 'censorship' is, and I include spam filtering in
the definition. For example, a while ago I saw an advert for some kind of
erection-causing pill on a forum discussing C++. If the forum moderators
remove such a post, they make the forum more efficient by saving other users
(who are looking for C++ content) from having to filter out irrelevant
information / a sales pitch themselves.

I also see far too many 'work from home' adverts in discussions in the
Independent newspaper's comment section (on unrelated articles), and if they
took a more censorious approach to such comments then the comment section
would improve.

To an extent I support censorship, and according to what I believe censorship
to be, almost everyone else supports it.

If this very comment had been about some unrelated topic, such as giving an
opinion about who to blame or not blame for problems in the Middle East, it
would be right for the comment to be censored from this discussion, and it
would be censorship from a private company, censorship of a political
viewpoint no less.

~~~
mc32
Here’s the thing. On dedicated forums we know the protocol. We don’t want spam
and we don’t want politics.

If it’s a political forum we expect all civil exchanges to be treated equally.
So a proponent of owls and a proponent of sawmills both get their say, and
one doesn’t get “deranked” or de-monetized for holding the less popular or
less au courant opinion. We don’t expect one political candidate to
be artificially ranked and another artificially buried in the results.

Now, if I’m on the Hillary blog, yes, of course I expect the org to manage the
commentary to fit their narrative. I don’t expect Zuckerberg or Pichai to turn
their orgs into the Hillary blog or the Donald blog.

~~~
crooked-v
> If it’s a political forum we expect all civil exchanges to be treated
> equally.

The problem here is that bad actors intentionally take advantage of this by
supplying an endless stream of 'civil' arguments for entirely abhorrent stuff.
This usually includes feigning ignorance and claiming they're 'just asking
questions'† when objections are raised, even though it's the tenth or fiftieth
or five hundredth time the same thing has come up.

†
[https://rationalwiki.org/wiki/Just_asking_questions](https://rationalwiki.org/wiki/Just_asking_questions)

~~~
archgoon
Given that there will always be an influx of new people, and that most people
will not be familiar with previous discussions, I'm not convinced that most
forums are being assaulted by bad actors. This seems to be more of an Eternal
September problem.

Of course bad actors can abuse this, though I've always felt it would be good
for derailing comments to be removed with a polite DM explaining that the
topic had been discussed previously, with links to said discussion.

~~~
crooked-v
It's definitely an organized tactic among some groups. For example, there's a
literal neo-Nazi handbook† that advises members of that group to disguise
their sentiments in civility and/or 'jokes' in order to sneak it into
mainstream discussion.

† [https://www.theguardian.com/commentisfree/2017/dec/19/neo-nazis-hatred-comedy-racist-daily-stormer](https://www.theguardian.com/commentisfree/2017/dec/19/neo-nazis-hatred-comedy-racist-daily-stormer)

~~~
mc32
Any movement can use those tactics. I wouldn’t be surprised if they read
_Rules for Radicals_ too. Any group looking for influence is going to use
tried-and-true methods.

So while there are many vile groups, like the Nazis and others seeking power,
there are also many groups who hold unpopular opinions, even opinions aiming
at outcomes the public presently considers illegal, and I’m not sure we want
to suppress that. Much of what counts as acceptable discourse today exists
because we allowed voices that were once considered degenerate or
unacceptable in one way or another.

We don’t need a new dogma telling us the correct way to think.

------
Ajedi32
> For instance, if you’re a journalist and you want to promote your work,
> there is simply no viable alternative to Twitter at the moment. No other
> microblogging platform comes even close to having the number of users
> Twitter does, and most of the alternatives are hotbeds of extremism, so that
> anyone who joins them is thereby disqualified in polite company.

IMO if you can't even _try_ to use an alternative microblogging platform for
fear of being labeled an extremist by "polite company", that's indicative of a
much more serious societal issue.

The quote included in this article from John Stuart Mill's _On Liberty_
explains the danger rather eloquently I think:

> Society can and does execute its own mandates: and if it issues wrong
> mandates instead of right, or any mandates at all in things with which it
> ought not to meddle, it practises a social tyranny more formidable than many
> kinds of political oppression, since, though not usually upheld by such
> extreme penalties, it leaves fewer means of escape, penetrating much more
> deeply into the details of life, and enslaving the soul itself.

~~~
driverdan
> IMO if you can't even try to use an alternative microblogging platform for
> fear of being labeled an extremist by "polite company"

I downvoted you because this is a ridiculous claim. If you use a platform like
Gab that embraces bigots and extremists perhaps, but that's not the only other
platform.

~~~
swebs
What are some other platforms?

~~~
MrEldritch
Mastodon comes to mind.

~~~
chipsa
Gab apparently runs on Mastodon now.

------
nullc
The problem that arises when you define censorship broadly is that the
opposite of censorship is censorship.

If no one can restrict what gets published in their venues then as soon as a
venue starts discussing stuff I don't like, I can spin up a shill call center
or a guy with a thousand accounts and a copy of GPT2 and flood that venue with
so much divisive moronicism that the voices that threaten me will be almost
completely drowned out.

Worse, large open venues being flooded with low quality communications already
happens even without any intentional censorship-by-flooding attack. ...
providing great cover for these attacks. Especially savvy attackers can
leverage the pre-existing populations of well meaning fools by shaping their
messages to motivate fools into carrying on the disruption on their own.

If you think about it-- the classical view of censorship where someone
outright silences you is nearly impossible online today in most of the world.
It is extraordinarily hard to completely stop the spread of information that
people want to spread. But at the same time, censorship that works by flooding
out an idea or discrediting it by association with abusive nutballs has never
been easier or more effective.

When we worry too much about private parties shutting down conversations in
their own venues we risk improving the situation around an outmoded and
somewhat ineffectual model of censorship at the expense of making a modern and
highly effective model much worse.

The best I think we can do is foster an internet structure where everyone can
have their own venues which they can operate under whatever rules they think
are best, and everyone is free to move among them at the lowest cost possible.
That way, effective moderation can shut down flooding attacks and voting-with-
your-feet can shut down overly censorious (or overly passive!) moderation.

Unfortunately, the highly centralized world created by the popularity and
network effects of sites like Facebook, Twitter, and YouTube is the opposite
of this. Instead of people being empowered to self-regulate abuse and
migration being easy, people are disempowered and migration costs are high.

~~~
leereeves
This is a rather common style of argument often used against civil rights:
every person's civil rights should be restrained because criminals abuse those
rights.

We can't have censorship free platforms because someone might spin up a
thousand accounts. We need the NSA to monitor all communications because
terrorists use the Internet. We should arrest 12-year-olds for making finger
guns because of school shooters.

But in this case as in so many, it's an off-topic argument, merely a
distraction or an excuse for overreaching authority. No one here is arguing
that someone who makes a thousand accounts shouldn't be restrained; that's
simply not the topic under discussion, even if the word "censorship" can be
stretched to include that topic.

~~~
nullc
Please. This isn't some hypothetical that I'm suggesting.

There are largely uncensored forums, like 4chan. That's what you actually get
when you go there, as opposed to something that is heavily moderated, but only
in ways you agree with, so that you can pretend it isn't moderated.

And in fact, I'll fully support you having your own platform which is as
uncensored as you want it to be. I wish you the best of luck.

What I don't support is you arguing that other people can't have their own
platforms which restrict publication however they see fit, including in ways
that you or I might disagree with.

People arguing that private parties can't limit the material posted on their
own property are absolutely taking a position which is contrary to free
speech. The power to exclude is just as important as, if not more important
than, the power to include.

If you'd like to argue that the dominance of a few platforms violates my
assumptions and changes the tradeoffs, I'm not sure I'd disagree. But I think
we should instead worry about fixing the monopoly problems rather than
limiting private parties' ability to exclude speech they disagree with on their
own properties.

~~~
leereeves
I think we should make a distinction between platforms and publishers. A
publisher curates content according to their opinion, to the extent that when
they publish outside opinions, they add a disclaimer like "Opinions expressed
herein do not represent those of the publisher". There's no such disclaimer on
reddit, Twitter, or Facebook because people understand that the posts aren't
coming from the platform. These sites aren't a place for the owners to express
their opinion; they're a mechanism for the users to communicate.

And when the mechanisms of communication are privately owned, freedom of
speech can't exist if the owners censor those mechanisms.

Platforms (as opposed to publishers) should be treated like telephone
companies or the mail or ISPs. They shouldn't even read the messages they
carry, let alone censor them.

On the other hand, sites aren't required to be a platform, but if they choose
to act like a publisher then they should be held responsible for the content
they publish.

------
Miner49er
So the author wants to protect "the marketplace of ideas" from monopolies
caused by a different market. However, they don't want the government to do
it. They also don't seem to think it's possible for smaller companies to
challenge these monopolies.

What then is the author's solution? One is never given.

~~~
gonational
In my opinion, the solution is simple:

If any “platform” censors beyond removing illegal or copyrighted content or
spam, they should have their “platform” status revoked, and they should be
considered a publisher.

This prevents such companies from using “platform” status as a cost saving
mechanism to help them become large monopolies. Companies like Facebook would
not have been able to take over so much of the market if they had to spend
hundreds of millions of dollars on curating content to prevent lawsuits, etc.

I also don’t think of this as some kind of punishment. It’s perfectly
reasonable to want to set up some kind of publishing company that has user-
generated content; it’s just not reasonable to expect such a company to grow
to such a global scale as Twitter or Facebook.

Edit: added “spam” to reasons

~~~
pavel_lishin
> _If any “platform” censors beyond removing illegal or copyrighted content,
> they should have their “platform” status revoked, and they should be
> considered a publisher._

What about spam?

And would this extend to marketplaces? If a company allows users to sell
educational resources for kids, what would happen if they start removing items
that clearly don't fit?

~~~
buboard
They could have a checkbox "remove spam" that the moderators could turn on and
off. There's not a lot of ambiguity about spam so it's not a real issue. I
think the law should be "platforms should not tinker with their users'
expressed preferences - you can't hide something that they've chosen to
subscribe to".

> And would this extend to marketplaces?

I believe these are already considered publishers? In any case, it's no
different from how traditional bookstores work.

~~~
danShumway
> There's not a lot of ambiguity about spam so it's not a real issue.

People say this, I don't think it's true.

I think there is ambiguity around when marketing crosses into spam. If you
look at different communities, even HN and Reddit, you'll find different
opinions about what kind of promotion crosses that line.

Individual communities need to be able to make their own decisions on that,
they don't need blanket rules applied across the board to all platforms.

~~~
buboard
Reddit already has a "spam level" slider. They should allow subreddits to turn
off spam filtering completely - although of course that doesn't make sense.
But if they want to be classified as a simple carrier, they would have to.

~~~
danShumway
In your definition of censorship, would it be OK for Facebook to hide some
political articles behind a "filter dishonest posts" setting that was on by
default for users and groups?

Is it OK to do that to a marketing post?

~~~
buboard
I think facebook should be required to have controls to turn off their various
filterings. And they should be held liable if someone can prove they
systematically hide certain views from searches. That would be enough for me
(having them on by default is also OK).

(Incidentally, I started using Twitter in "view latest" mode and I prefer it -
turns out their expensive algorithms were pretty useless for me.)

------
imgabe
Instead of an ideological bias, what if it is just a demographic bias?

The author supposes that there are some mustache-twirling masterminds behind
the scenes at Google, Twitter, and Facebook, picking and choosing what content
is promoted. But the algorithms, at their most basic sense, promote what is
popular.
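
Reddit's old ranking code was open source, and its "hot" function makes the point concrete: the score is computed purely from net votes and submission time, with no input describing the content or its politics. A rough Python transcription of that published formula (a sketch of the old open-sourced version, not Reddit's current ranking):

```python
from datetime import datetime, timezone
from math import log10

EPOCH = datetime(1970, 1, 1, tzinfo=timezone.utc)

def hot(ups: int, downs: int, date: datetime) -> float:
    """Old Reddit 'hot' score: log of net votes plus an age bonus."""
    score = ups - downs
    # Each factor of 10 in net votes adds 1 to the order of magnitude.
    order = log10(max(abs(score), 1))
    sign = 1 if score > 0 else -1 if score < 0 else 0
    # Newer posts get a larger time term: +1 per 45000 s (12.5 hours).
    seconds = (date - EPOCH).total_seconds() - 1134028003
    return round(sign * order + seconds / 45000, 7)
```

A post needs roughly 10x the net votes to outrank an otherwise-equal post submitted 12.5 hours later, which is also why sustained vote manipulation can push content onto the front page regardless of organic interest.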

Based on voting records, in absolute numbers, there are simply more left-
leaning people (at least in the US) than there are right-leaning people.
Gerrymandering and the electoral college, etc. don't reflect this in the
government, but it is the case.

Left-leaning viewpoints are promoted more, then, simply because there are
more people who want to see them. There's no reason to posit a nefarious
conspiracy.

We could argue that it would be beneficial to promote less popular viewpoints
more to expose people to new ideas. It probably would be a good thing. But
these companies are in the business of making money and people generally just
want to see things that confirm their own biases. Just look at how upset
conservatives are that they can't find enough content like that ;)

~~~
buzzkillington
>The author supposes that there are some mustache-twirling masterminds behind
the scenes at Google, Twitter, and Facebook, picking and choosing what content
is promoted. But the algorithms, at their most basic sense, promote what is
popular.

Hardly. The algos have programmers' thumbs on them, heavily. A hilarious
example was when Reddit was trying to deal with the_donald successfully
spamming Reddit's hot algorithm. Someone screwed up the new weighting, and
instead of pushing td posts down it pushed them up; the whole front page was
nothing but td posts for a day. They eventually fixed it, but never think
that there isn't a human jury-rigged mess of if-else statements at the core
of the algorithms.

~~~
imgabe
That just proves my point. the_donald had to find an exploit and resort to
spamming in order to get the algorithm to push them onto the front page. They
couldn't get there on the basis of, you know, actually being popular because a
large number of people want to read them.

~~~
buzzkillington
>That just proves my point

It proves that Reddit admins didn't like right-wing shitposting and were
willing to rewrite the algo to deal with it, which is completely the opposite
of the point you were making: that these algos are somehow neutral.

~~~
imgabe
It proves that reddit admins take steps to deal with people who abuse the site
with spam, which is what one would expect of any forum administrator for any
spam content.

If their content was so compelling, why did they have to spam it? Why wouldn't
it just attract users organically the way other content does?

~~~
buzzkillington
>Why wouldn't it just attract users organically the way other content does?

[[citation needed]] Would you like to present proof that t_d did not grow
organically in 2015/2016?

So far you're heavy on claims and light on proof.

~~~
imgabe
Citation? No problem. Here it is:

> A hilarious example was when Reddit was trying to deal with the_donald
> successfully spamming reddits hot algorithm.

\- buzzkillington in Hacker News comment:
[https://news.ycombinator.com/item?id=21463480](https://news.ycombinator.com/item?id=21463480)

You were the one who stated that /r/the_donald spammed Reddit to reach the
front page. Are you changing your story and now saying they reached the front
page organically? Are you confused about what spamming is?

If there really were a large number of people genuinely interested in
the_donald, it would have reached the front page without spamming. The reddit
algorithm is probably also biased against posts trying to sell penis
enlargement pills. Do you think that is also oppressive?

~~~
buzzkillington
If you haven't already, would you mind reading about HN's approach to comments
and site guidelines?

>Be kind. Don't be snarky. Comments should get more thoughtful and
substantive, not less, as a topic gets more divisive.

[https://news.ycombinator.com/newsguidelines.html](https://news.ycombinator.com/newsguidelines.html)

------
woopwoop
Replace "censoring conservative viewpoints" with "censoring the nice people
who tell me about how I can make $13,000 a week working from home". Which part
of this argument changes? I can't find any part that does.

~~~
jerf
I can't speak for the article, but I can speak for myself. The difference is
that I, personally, have control over the blocking of the "make money fast"
communication. I think part of the right to free speech is the right to be
able to choose what speech you consume, i.e., everyone has the right to free
speech but nobody has the right to _impose_ upon you.

For me personally, since I run my own incoming mail server, there is no
platform censoring those emails. I use my own tools, trained on my own emails
and my own choices about what to block, to do the filtering.
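
For what "my own tools, trained on my own emails" can look like in practice: the classic approach is a word-level naive Bayes filter, in the spirit of Paul Graham's "A Plan for Spam". A toy sketch, with invented class and method names (real filters add proper tokenization, header features, and per-user thresholds):

```python
import math
from collections import Counter

class NaiveBayesFilter:
    """Tiny word-level naive Bayes classifier, trained on one's own mail."""

    def __init__(self):
        self.counts = {"spam": Counter(), "ham": Counter()}
        self.totals = {"spam": 0, "ham": 0}  # messages seen per label

    def train(self, label: str, text: str) -> None:
        self.counts[label].update(text.lower().split())
        self.totals[label] += 1

    def is_spam(self, text: str) -> bool:
        # Log-probabilities with add-one smoothing; prior from message counts.
        # Assumes at least one training message of each label.
        vocab = len(set(self.counts["spam"]) | set(self.counts["ham"]))
        scores = {}
        for label in ("spam", "ham"):
            n = sum(self.counts[label].values())
            score = math.log(self.totals[label] / sum(self.totals.values()))
            for w in text.lower().split():
                score += math.log((self.counts[label][w] + 1) / (n + vocab))
            scores[label] = score
        return scores["spam"] > scores["ham"]
```

The point of the design is that the blocking decision is trained on, and controlled by, the recipient's own labeled mail, rather than a platform-wide policy.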

When a platform entirely blocks some speech, that is not my choice; it is
theirs. That is the critical difference.

This is still not a bright shining line, because there isn't one. But it's
sufficiently clear to be a guiding principle. If you _want_ to lock yourself
in some particular filter bubble, that's your choice, but nobody else has the
right to _force_ you into one.

~~~
klyrs
> The difference is that I, personally, have control over the blocking of the
> "make money fast" communication

Have you never encountered a phpBB forum, or a wiki, or a social network, that
doesn't have spam blockers? In my experience, such sites are inundated with
spam, occasionally to the point that signal is hard to find over the noise.
Your "solution" only touches on email. Filtering spam from such diverse
sources would require a much more intrusive technology. Also, you're putting a
wholly unreasonable burden on website owners to preserve abusive content -- if
somebody uploads a flood of junk data, are you legally required to retain and
redistribute it all? And are you forcing users to download gigabytes of
garbage, and filter it down to a kilobyte of content?

~~~
jerf
As a pragmatist on this matter, I believe the solution here is scale. The
control I have over a phpBB forum is that if I don't like their spam policy, I
can leave, and either join another or form my own.

We're not talking about a phpBB forum or a little blog with a few thousand
commenters. We're talking about single companies with very significant
fractions of the total discourse flowing through them. If Facebook blocks
something, it affects a massive number of people, and "just go make your own
Facebook" is not a suitable recourse. Scale matters. Small entities can do
things without being a threat to the body politic that large entities can not.

As I've said before, I'm not entirely convinced that something the size of
Facebook can actually work in the long term [1]. Trying to jam everybody into
one set of rules may simply be infeasible, for reasons not entirely related to
"rights". If you look at the current conflicts over Facebook through this
window, you may find the conflict makes more sense. Especially if you are one
of the majority here on HN who are kinda lost over why so many have become
anti-Facebook: when you just see Facebook imposing your perfectly sensible,
obviously correct values on the world, why is everyone complaining?
Well, it's that whole diversity thing you may have heard rumors about. It's
real. It may not be the case that you can get everyone into one community with
one set of standards, regardless of how "obviously right" those standards are.

[1]:
[https://news.ycombinator.com/item?id=20146868](https://news.ycombinator.com/item?id=20146868)
- in fact, I'd amplify the original post I made by also pointing out Facebook
is _international_ , so we're not just trying to stick the diverse cultures of
the US in there, but trying to stick all the diverse cultures of the _world_
in one big pile. When I put it that way, does the idea that such a thing could
work in the long term even pass the smell test?

~~~
klyrs
> We're not talking about a phpBB forum or a little blog with a few thousand
> commenters.

I listed examples including "wikis" and "social media sites" which
categorically contain the largest websites in the world. You'll find websites
of all orders of magnitude from tens to millions of users -- where do you draw
that line? But more importantly, are you under the impression that large-scale
sites are somehow immune to spammers?

> As a pragmatist on this matter, I believe the solution here is scale. The
> control I have over a phpBB forum is that if I don't like their spam policy,
> I can leave, and either join another or form my own.

I'm not understanding something here. You think that it's okay for spammers to
implement a "heckler's veto" by de facto shutting down every single niche
website that (by law, presumably) accepts/rebroadcasts any and all submitted
content -- that you'll be able to "vote with your feet" and move/create a site
or forum that will somehow be immune to this onslaught? At what point is
"scale" the answer? Bigger is better?

> As I've said before, I'm not entirely convinced that something the size of
> Facebook can actually work in the long term [1].

I'm suffering whiplash. You gotta help me here. You're happy to feed small
websites to the spamwolves, yes? But you're not convinced that big websites
are good either. Do we just cancel the entire internet?

I'll grant that I'd probably get more work done.

~~~
repolfx
It's easy to write laws such that totally blocking automation is allowed,
totally blocking human written content is 'censorship', and
categorising/hiding by default/ranking content is allowed.

Yes, this doesn't deal with things like "is YouTube ranking biased" but it's a
start. You posit that sites would be obliged to publish onslaughts, but many
spam filters are pretty good at filtering such attacks. For instance, that's
where CAPTCHAs came from (there are better technical solutions than CAPTCHAs,
but you get the idea).

------
leshow
The article is good, but it could do with a lot less ad hominem and calling
people 'idiots'.

You can do better than this:

> you may think that you’re very sophisticated, but actually you are just
> being ridiculous.

> Sure, you can say that if you want (though I don’t think that’s how the word
> is actually used by ordinary people), but only idiots think it shows that
> it’s not a problem.

> I know that some people, especially among libertarians, think it makes them
> smart to believe that, but they clearly have not thought this through.

------
rhaksw
I'm always surprised when articles like this don't mention reddit. Isn't it a
top 10 website that was once top 3? I've read quotes saying one in three
Americans has visited reddit. Anyway, reddit is a great resource for studying
moderation activity.

~~~
Nasrudith
That is because it oozes the pretense of refusing to accept that their
"silent majority" is really a noisy minority. Not recognizing freedom of
association is one of the big warning signs, along with not realizing the
impact of what they want on /themselves/. Personally, at that point I
mentally mark them as spam.

Annoyingly, they conflate actual corporate censorship concerns with their
persecution complex. DMCA abuse is actual corporate censorship, for instance.
Same if, say, Comcast blocked Netflix. The government using corporations as a
cudgel to do censorship that would be struck down in an instant if written
into law is censorship too, hidden behind a messy semantic argument about
corporate vs. government origin. These complaints, by contrast, boil down to
"they aren't giving me an audience, how dare they!" There is a world of
difference between not being given an audience and not being able to publish.

~~~
rhaksw
I don't see how this relates to leaving reddit out of articles about
moderation in social media.

------
tssva
The article starts with an unsupported premise that the platforms these
companies operate favor liberal over conservative speech so my natural
inclination is just to dismiss the rest of it. However, even if it's true,
the solution to improving freedom of speech is not suppressing someone else's
freedom of speech.

~~~
weci2i
I probably can’t supply data that would convince you liberal speech is favored
over conservative speech. However, within the past month my personal Facebook
account was curiously disabled within an hour of typing the word “Christian”
in a comment. I would not identify as conservative or liberal. I certainly
can’t prove that merely using this word was what triggered the permaban, but
Facebook absolutely would not reinstate my account, even after I supplied a
valid driver’s license as proof of my identity.

------
jon_akimbo
It's hard to take this argument seriously when the only companies mentioned
are left-leaning tech companies such as Twitter, Facebook, and Google.

I suspect that the author's concern isn't censorship per se, but censorship of
viewpoints he happens to uphold. If Twitter's suppression of conservative
content is bad, then so is The Federalist's suppression of liberal content.

You're about to argue that one of these is a social media platform, while the
other is a news outlet. In that case, what do you think is the essential
difference that makes it okay for one to censor certain ideas but not the
other?

~~~
lolinder
Isn't it more likely that the author chose Facebook, Google, and Twitter not
because they're left-leaning, but because their traffic is vastly more
significant than The Federalist? The three are all top-50 sites globally. The
Federalist doesn't even make it into the top 10k.

It's pretty rational to focus more on the sites that have the most traffic
when discussing bias, isn't it?

~~~
jon_akimbo
> It's pretty rational to focus more on the sites that have the most traffic
> when discussing bias, isn't it?

I feel that if a person is making an argument from principle, then that
principle should apply everywhere and not just where politically convenient to
them.

~~~
lolinder
The author does:

> ... although it can be leveraged against big tech companies that are biased
> in favor of liberals, this line of argument also has implications that
> conservatives may less readily welcome. For instance, it means that how rich
> people use their money to promote their ideas may also be a problem, insofar
> as it distorts the marketplace of ideas.

The principle he's arguing is that when an organization that has a de facto
monopoly on a type of information stream (search, microblogging, and social
circle media) uses that monopoly to favor one political viewpoint, that is
harmful to society because it "distorts the marketplace of ideas" by
preventing equal access to said marketplace.

The Federalist is not in this position. For every conservative magazine like
The Federalist, there's a liberal one like Mother Jones. This is the
"marketplace of ideas" that the author refers to in action, not a distortion
of it. So the author is, in fact, arguing from principle, just not the
principle you're using.

------
llcoolv
How could a private company impose censorship when none of them are allowed
to use violence? They could only ban certain types of behaviour on their
premises, right? Which is not really censorship, as one could just go to
someone else's premises or found one's own.

~~~
crooked-v
The real problem here isn't that companies are allowed to control what content
goes on their platforms, but that companies are allowed to get so large that
they're a de facto monopoly for much of the world (e.g. Facebook).

------
ZeroGravitas
Regulating private companies for the benefit of society? I didn't think that
idea was still within the Overton window.

The phrase "hoist by their own petard" seems appropriate here.

------
dragonwriter
> For instance, if you’re a journalist and you want to promote your work,
> there is simply no viable alternative to Twitter at the moment. No other
> microblogging platform

Microblogging platforms aren't the only venue for journalists to promote their
work.

But, even if that weren't the case, that's not an argument that private
censorship is bad but that private monopolies are bad, and that censorship by
monopolies on essential communications mechanisms (public or private) is bad.

------
boomlinde
The problem I think is that we let these companies grow so big, and let
society grow so dependent on them that their policies become a worldwide
public issue. If we become dependent on Twitter or Facebook to participate in
democracy, regulation should follow.

------
adrianN
There is outright censorship, or rather removing topics that the owner of the
platform doesn't like, and then there is the much more serious problem of
creating echo chambers for the users where dissenting opinions are not
deleted, but hidden.

------
Starkus
Censorship is awful, and sometimes even worse when done by large tech
corporations, e.g. Google, Facebook, etc.

------
jacquesm
Politicized flamebait, flagged. Call it censorship if you want.

------
nova22033
_For several years, conservatives have expressed concern that big tech
companies, such as Google, Facebook and Twitter, are suppressing right-leaning
content in various, more or less direct ways. Even though conservatives often
exaggerate it, I think there is no doubt about the anti-conservative bias of
these companies, but that is not what I want to discuss in this article._

You don't want to discuss it here because you don't have a detailed analysis
backed up by solid data. If you did, you'd lead with it.

Trust me, most people here can handle an article full of graphs and stuff.

------
RickJWagner
I've been flabbergasted at the number of _universities_ that have recently
stifled speakers.

I agree with much of what the author wrote. It's time for change. Classical
liberalism may be best.

------
d--b
See, but there's a libertarian issue here.

If you value freedom above all else, then Google, being an independent entity,
should have the freedom to ban / veto whomever they want.

You can't have it both ways: let people be free to say whatever they want AND
force the owners of communication channels to spread all information.

~~~
ChaosDegenerate
Freedom comes with responsibility. The idea that they are not responsible for
the content (i.e. you can't hold Youtube/Google legally responsible for the
content on their platforms) while at the same time they profit from that very
content and can censor it as they see fit -- this is idiocy in its purest
form.

We hold newspapers and TV stations legally liable for the content they serve.
This is fair. A situation in which they can't be sued for the stuff they air
while still making money on it -- that's the current situation with these
platforms. It's even worse because they can censor the content they don't like
for their own political reasons.

It's like with banks. Yes, we can make money. And yes, when times are bad we
can get money from the government/taxpayers too. This is not freedom; this is
fascism.

~~~
Nasrudith
That definition of responsibility negates freedom in itself. I have literally
seen it used by dictators who set libel laws to crush any negative stories.

The stance is absurd - it is akin to saying that Kinkos should be liable for
anything printed (beyond something like knowingly using ink laced with
weaponized anthrax), and should be charged with terrorism because a customer
quietly printed a bomb threat formatted to look like a resume - just because
they kicked you out for openly printing Goatse.ck and every other old shock
site that scares away the other customers.

~~~
ChaosDegenerate
Exactly my point. But for this to work, they need to be regulated by
Government as utilities.

No one has the right to turn off the electric grid to your house because they
are a "private company". No one has the right to turn off the running water to
your sink because they are a "private company" and tell you that if you don't
like it, you can go somewhere else or open your own. In the same way, no one
has the right to censor me because they are a "private company" and if I don't
like it I can go ahead and open my own Youtube. They should be and will be
regulated as utilities.

~~~
danso
What precedent are you using to imagine that if Facebook/Youtube/etc were to
become public utilities, they would be barred from censorship? Public
broadcasters are currently regulated by the FCC:

[https://www.fcc.gov/media/radio/public-and-broadcasting#OBSC...](https://www.fcc.gov/media/radio/public-and-broadcasting#OBSCENE)

> _Indecent Material. Indecent material is protected by the First Amendment,
> so its broadcast cannot constitutionally be prohibited at all times.
> However, the courts have upheld Congress' prohibition of the broadcast of
> indecent material during times of the day when there is a reasonable risk
> that children may be in the audience, which the Commission has determined to
> be between the hours of 6 a.m. and 10 p.m. Indecent programming is defined
> as “language or material that, in context, depicts or describes, in terms
> patently offensive as measured by contemporary community standards for the
> broadcast medium, sexual or excretory organs or activities.” Broadcasts that
> fall within this definition and are aired between 6 a.m. and 10 p.m. may be
> subject to enforcement action by the FCC._

------
JediWing
Know what else distorts the marketplace of ideas?

Money.

Money allows for buying advertisements, essentially allowing you to buy a
platform to spread your ideas.

Having money typically has a pretty clear bias: protecting the money of those
who have it. This comes at the expense, many times, of those who do not.

That typically equates to conservative policies.

I'm not convinced that the market distortion caused by supposed bias in social
media moderation is even close to the distortion created by allowing money in
politics. In fact, in many (not all) cases of purported bias, the true impetus
for action was hateful or otherwise harmful speech.

Nor do I agree that if we disagree with state censorship, we should disagree
with Facebook censorship.

Being thrown into the legal system with threat of monetary penalty, jailtime,
etc. is worthy of much more scrutiny than speech that gets you banned from a
social network.

~~~
FussyZeus
This was a big part of why I enjoyed Jack Dorsey's comments on Twitter
removing political advertising.

The Internet was supposed to be a democratizing force, and it was, for a
while. To an extent it still is, but as the convergence of content continues
to pull people into the various Platforms™, all of which are supported by
advertiser dollars, the content itself is diluted. Some of it not much, some
so much it's no longer recognizable, and some so much that it's no longer
there.

YouTube is a great example of this. Once a monument to the power of
democratized content, it's slowly but surely pushing out real creators in
favor of advertising companies producing advertiser-friendly, oh-so-clickable
content, much of which is complete and utter trash. It's not just marketing
companies, though: a new breed of creator has spawned since roughly 2010,
determined to wring as many advertising dollars out of the platform as they
can with content as vapid and soulless as the platform will suffer.

And all of this is sponsored in turn by equally vapid and soulless corporate
advertisers, determined to make sure their precious brands are not associated
with anything as uncouth as an opinion or a real person.

Hell, there's a great argument to be made that the "pivot to video" and the
resulting catacombs full of gutted and destroyed media websites can be laid at
the feet of corporate marketers who wanted video ads. Nay, it was no Internet
citizen who claimed the future was video; we didn't ask for it, and we
certainly didn't like it.

The whole thing is gross, and far from isolated to YouTube; it has plagued
them all, save perhaps Twitter, and I think the only thing saving Twitter is
that it's just not as easily monetized. But Twitter has a different problem,
which I believe Jack is trying to solve: the influx of automated bots, bought
and paid for by other marketers, more often trying to sell politics than
products. It's just a different kind of feces from the same anus, indicative
of the same core problem: advertisers suck, at everything.
And they have more power than ever on the Internet, and so the Internet sucks
more than ever.

~~~
AnthonyMouse
We need to think about why advertising is winning.

A big reason is that advertising is a tax deduction. If I fund my content with
advertising, the advertiser gets to deduct the money they spend from their
income tax. If I sell content to users, the users don't get to do that.
Between the employer and employee halves of FICA, federal withholding and
state income tax, that's typically more than half the money. To make it even
we would either need a tax deduction for purchasing content or to get rid of
the one for buying advertising.

Then you have companies like Apple taking a 30% cut of the creator's revenue.
The customer has to pay $1.43 in order for the creator to get $1. That's
obviously making direct purchases less competitive against advertising -- and
it multiplies with the tax treatment. If you're paying 50% in tax and then 30%
to Apple, the customer needs to earn ~$2.86 for every $1 they want the creator
to get. If two content creators start with $1 and use it to buy each other's
content, by the time the money goes to the other person and back one time,
what's left is only ~$0.12.

It would be a lot easier to _sell_ content if the creators actually got the
lion's share of the customer's money.
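A quick sketch of that arithmetic (the ~50% combined tax rate and 30% platform cut are the comment's illustrative assumptions, not actual tax figures):

```python
# Sketch of the comment's arithmetic. The ~50% combined tax rate and the
# 30% platform cut are illustrative assumptions, not real tax figures.
TAX_RATE = 0.50       # assumed combined FICA + federal + state rate
PLATFORM_CUT = 0.30   # e.g. a 30% app-store commission

def gross_earnings_needed(creator_take):
    """Pre-tax income a customer must earn for the creator to net `creator_take`."""
    price = creator_take / (1 - PLATFORM_CUT)  # customer pays ~$1.43 per $1
    return price / (1 - TAX_RATE)              # ~$2.86 of pre-tax earnings

def round_trip(amount, hops=2):
    """Money left after passing through the platform cut and income tax `hops` times."""
    for _ in range(hops):
        amount *= (1 - PLATFORM_CUT) * (1 - TAX_RATE)
    return amount

print(round(gross_earnings_needed(1.00), 2))  # ~2.86
print(round(round_trip(1.00), 4))             # ~0.1225
```

Each hop keeps only 70% × 50% = 35% of the money, which is why two hops leave ~$0.12 of the original dollar.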

------
root_axis
Private companies should be able to censor political speech, period. If it's
my desire to build a platform that is favorable to certain types of speech,
that's my right and the government shouldn't be forcing me to modify my
platform to rebroadcast speech I disagree with.

In terms of conservative complaints about leftist censorship, I have to simply
roll my eyes. The conservatives complaining about big tech don't care about
censorship, they only care about the dominance of their own political
messages. This is obvious when you ask them about "democratizing speech" on
platforms they already dominate like cable news and talk radio. Remember the
fairness doctrine? They hated the idea of having the government dictate the
political composition of the airwaves, but when it comes to big tech, suddenly
they want to nationalize the platforms to their own benefit. I find the effort
extremely disingenuous.

Also, something something downvotes something something censorship!!

~~~
Grumbledour
I think you are right. But I would also argue that such censorship should be
as transparent as possible.

Though the real question is: at a certain size of user base, does a private
space not become a public one? Or rather, if there is no censorship-free
alternative, no place else to go, does the censorship not become a problem
then?

I am not sure what the solution here would be, actually, but while I think
people who don't like Facebook's/Twitter's censorship should just go somewhere
else, I would not hold the same view on, for example, email providers.

~~~
root_axis
I would be willing to pay taxes to fund a government run online "public
square" that could not be legally censored, but I am fundamentally opposed to
taking control of privately owned websites and forcing the owners to host
content that they disagree with, especially political content. I disagree with
the idea that the number of people visiting a website is an important metric
in determining the owner's property rights.

> _at a certain size of user base, does a private space not become a public
> one_

I think that's a question worth exploring, but any honest discussion of this
idea has to acknowledge that most of these social media companies are
essentially trivial entertainment sites, and that if we are really trying to
designate access to Twitter as a basic human right, we must first consider why
life-saving drugs, housing, and healthcare are not human rights.

------
bkor
Not that impressed with this article. It states, as if it were fact, that the
"right is suppressed" and the "left" is "in favor". Then it never really
explains why censorship might be bad or not. It never gets concrete, just
something vague about the "marketplace of ideas".

IMO it's bad to have places where only one side is heard and the other is not.
Further, there are too many places where people might feel their opinion is
the majority one when it isn't. Usually people would notice they're the odd
one out; that's not the case in various places (e.g. subreddits).

Note that I'm from the Netherlands and do not agree that freedom of speech
above everything else is a good thing. There should be some limits to it, and
those reasonable limits should be in place.

Making distinctions between right/left and complaining that more of your stuff
is questionable... yeah... so? The "left"/Democratic party I consider pretty
conservative and not looking out for its people. As such, it's always strange
to see the complaints; what passes for right/left in e.g. the US looks to me
like "right wing" vs "even more right wing".

I also do not see the restrictions as a problem; there should be way more
limits on reddit/twitter/facebook. Further, I think the companies should be
held responsible for this.

------
lonelappde
This is such an own-goal. Before worrying about censoring unwanted content,
companies should spend their time not _promoting_ it in the first place (and
building client-side filtering controls), which is just as huge a problem and
not anti-American to fix.

The problem is that companies _want_ the flamebait and toxic content to drive
enragement, engagement, and ad impressions, but don't want to be accountable
for the stuff that's just too edgy for advertisers.

~~~
jcriddle4
"Companies want the flamebait..." The moderation system on reddit seems to
heavily promote groupthink. Actually promoting controversial comments as a
default would be interesting.

~~~
nova22033
_Actually promoting controversial comments as a default would be interesting._

Controversial is subjective. Would you like FB/reddit to promote holocaust
denial?

~~~
jcriddle4
Reddit has an option to sort by controversial on a subreddit and also on
comments. I suspect the algorithm is looking for things with a combination of
upvotes and downvotes. Take a look at any subreddit or comment thread and sort
by controversial if you want to see what it looks like. If you defined
controversial as anything with a 15% to 55% downvote ratio, that would be
fairly specific. Over roughly 60% downvotes, it may be off-topic or spam. It
would be an interesting experiment, for a few days, to change the default from
"hot" to "controversial".
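Reddit's real controversy sort is more involved, but the ratio rule described above could be sketched like this (the 15%/55%/60% thresholds come from the comment; the function names are made up for illustration):

```python
# Hypothetical sketch of a ratio-based "controversial" filter. Thresholds
# (15%, 55%, 60%) are taken from the comment; names are illustrative only.

def downvote_ratio(ups, downs):
    """Fraction of total votes that are downvotes (0.0 when there are no votes)."""
    total = ups + downs
    return downs / total if total else 0.0

def is_controversial(ups, downs, lo=0.15, hi=0.55):
    """True when 15-55% of votes are downvotes: real disagreement, not a pile-on."""
    return lo <= downvote_ratio(ups, downs) <= hi

def is_likely_spam(ups, downs, cutoff=0.60):
    """Over ~60% downvotes: probably off-topic or spam rather than controversial."""
    return downvote_ratio(ups, downs) > cutoff

print(is_controversial(60, 40))  # True  (40% downvotes)
print(is_likely_spam(20, 80))    # True  (80% downvotes)
```

A simple band like this deliberately excludes both near-unanimous approval and near-unanimous rejection, surfacing only items where the vote is genuinely split.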

