
Facebook and other companies are removing viral ‘Plandemic’ conspiracy video - aspenmayer
https://www.washingtonpost.com/technology/2020/05/07/plandemic-youtube-facebook-vimeo-remove/
======
aspenmayer
‘Social media companies including YouTube, Vimeo and Facebook are removing a
viral conspiracy theory video because of its claims regarding the coronavirus
pandemic.

‘The approximately 26-minute video was presented as an extremely long
“trailer” for a full-length film titled “Plandemic” and features an extended
interview with Judy Mikovits, a well-known figure in the anti-vaccination
movement who has made various discredited claims about the effects of
vaccines.’

‘By the time it was removed from Facebook, it had racked up “1.8 million
views, including 17,000 comments and nearly 150,000 shares,” Digital Trends
reported.’ [0]

‘Twitter responded to The Washington Post by stating that a tweet from
Mikovits sharing a different video interview of hers does not violate its
covid-19 policy but that the company has removed the hashtags
#PlagueofCorruption and #PlandemicMovie from its searches and trends
sections.’

[0] https://www.digitaltrends.com/news/facebook-will-take-down-viral-plandemic-coronavirus-conspiracy-video/

~~~
eloff
This movie is trash, and removing it is a good thing. But I'm very
uncomfortable with the power of these private companies to censor and shape
public opinion.

Will we see a "Facebook influenced the election" scandal similar to what
happened with Russia?

What happens when they silence an unpopular voice that is nevertheless true?
Are we all not therefore poorer as a result?

It feels like a Black Mirror episode in the making.

~~~
thephyber
YouTube has long been problematic with their removal of content (especially
unsupportable copyright claims), but when it comes to this issue I don't care.
Facebook and Twitter are still struggling to come up with guiding principles
for content moderation and to distribute and standardize them to tens of
thousands of moderators, so I'm willing to give them some leeway.

The core issue is: nobody has an expectation that they are guaranteed their
(stupid) content _must_ be accepted by a social media platform.

So long as a person can make+host their own website or torrent for the
controversial content (yes, I get that this is much higher friction), all that
is lost is the _convenience_ of _finding_ the content. I'd wager The Anarchist
Cookbook and issues of the ISIS magazines are still bouncing around the
internet somewhere, but they shouldn't be available for the average teenager
to stumble upon when their crazy uncle shares/retweets it to the world. It's a
quick way to empower angst and anti-government sentiment among those who feel
powerless (literally the recipe for terrorism).

The only thing I dislike about the removal of this video content is that it
gives the gullible conspiracy theorists the "they are removing it because they
know it's true!!1!" argument.

~~~
eloff
I mostly agree, except that convenience is everything. Without distribution, a
video is just a string of bits. If nobody can find it, it doesn't matter what
the content of the video is - it doesn't exist.

~~~
aspenmayer
It seems like a variant of the Library of Babel problem with a Traveling
Salesman twist.

------
JacksonGariety
Why not just serve up a disclaimer video exposing the flaws in 'Plandemic',
and require users to watch that video first? That way, the people who watch
this sort of stuff will be able to learn to think for themselves by resolving
the cognitive dissonance.

~~~
aeternum
Requiring users to watch probably isn't the right move, but simply suggesting
a video with the opposite viewpoint would be an excellent one.

With every tech company now using "AI" recommendation engines, users only see
items that reinforce their beliefs. These tech platforms could really benefit
society by simply showing opposing viewpoints once in a while instead of
constantly reinforcing tribal beliefs.

~~~
Nasrudith
That sounds like any algorithmic implementation would be doomed to spread
misinformation out of a misguided sense of fairness. It sounds very "teach the
controversy".

Falsehood is not equal to fact. Favoring the truth is the goal, but expecting
even a human process to do that reliably would be foolish - an algorithmic one
even more so.

~~~
aeternum
The problem with that line of reasoning is that it assumes we have some oracle
that can tell the difference between falsehood and fact.

Much important societal change starts out as a fringe theory. Remember that
for much of history the idea that women should not vote was considered a
relatively undisputed fact.

Sunlight is the best disinfectant; society has _never_ been successful in
trusting some oracle or government body to determine what is true. Why do we
believe the large tech companies will succeed at this?

