
Facebook is harming our democracy - superbatfish
http://www.vox.com/new-money/2016/11/6/13509854/facebook-politics-news-bad
======
scarmig
1) There's an irony that Vox of all companies is complaining that Facebook
rewards click bait and punishes thoughtful pieces that require time to read
and digest.

2) The article makes an interesting attempt at being even-handed: "it's not
just right-wing Trump supporters who are being evil, but also left-wing Bernie
supporters!"

3) The solution that Vox is suggesting is for Facebook to appoint a high
council of editors who approve which stories are worthy of being allowed to be
shared, and which are evil and duplicitous. For some reason I strongly suspect
that the set of stories they want deemed good are those that align with Vox's
editorial philosophy.

4) I'm curious how they would handle something like the Iraq war, where the
two legitimate sides promoted by serious, responsible folks like Ezra Klein
and the NYTimes were liberals who supported a humanitarian war and
conservatives who supported a pre-emptive defensive war. How likely is it that
the people who should have been banned from public Facebook discourse are
those conspiracy nuts who thought the USA should stay out of Iraq?

~~~
remarkEon
> 1) There's an irony that Vox of all companies is complaining that Facebook
> rewards click bait and punishes thoughtful pieces that require time to read
> and digest.

Yeah, I had the same reaction. It was also somewhat rich that they cited a
BuzzFeed article as evidence that people are gaming our predilection for
clickbait. Though, now that I think about it, since BuzzFeed _is_ clickbait,
they would certainly be the experts. And in a way, Vox takes advantage of it
too. Here's a short selection of my favorite "Vox [explains x] in [<=500
words]" pieces.

[http://www.vox.com/2016/8/29/12692546/obamacare-whats-wrong](http://www.vox.com/2016/8/29/12692546/obamacare-whats-wrong)

[http://www.vox.com/2016/7/15/12204172/turkey-coup-erdogan-mi...](http://www.vox.com/2016/7/15/12204172/turkey-coup-erdogan-military)

[http://www.vox.com/science-and-health/2016/10/4/13155916/201...](http://www.vox.com/science-and-health/2016/10/4/13155916/2016-nobel-prize-physics)

[http://www.vox.com/2016/9/7/12817566/donald-trump-immigratio...](http://www.vox.com/2016/9/7/12817566/donald-trump-immigration-position)

Last one for kicks. To be fair to Vox, they _do_ produce some long-form stuff
and often link it at the bottom of these shorter pieces, but these explainers
are not helpful and, I think, produce a rather shallow understanding of
extremely complex issues.

~~~
scarmig
The ideal article for Vox to write along these lines would actually be
something like: "don't expect to get any meaningful understanding of issues
from articles that go viral through Facebook or other social media; wait at
least a couple of months or years until a current event has played out, so you
have enough context and expert analysis to come to an informed opinion."

Unfortunately, that'd sort of undermine their entire business model.

~~~
majewsky
You don't have to wait months or years. I've moved most of my media
consumption from daily news broadcasts to a single weekly newspaper, which for
me strikes the best balance between timeliness and context/interpretation.

~~~
bogomipz
I do something similar via The Economist. May I ask what source you've found
that serves you well on a weekly basis?

~~~
majewsky
I'm subscribed to "der Freitag", a German weekly leaning to the left, but with
a healthy amount of internal debate.
[http://www.freitag.de](http://www.freitag.de)

------
adrenalinelol
The "gate-keeper" media gave us the Iraq war. Anyone who was skeptical at the
time was dismissed as a loon or, worse, censored by outlets simply refusing to
cover their viewpoint. Vox's argument boils down to: FB regressed in area Y,
therefore it's worse than the system that came before it. This is incredibly
short-sighted thinking, to the point where it comes across as deliberately so.

I'm not a fan of click-baity articles, but the chance for all voices to at
least be heard trumps (no pun intended) relying on what we may dub the "media
elite" for our news. Time and time again they've shown they aren't the
objective, neutral actors they claim to be in all cases. Of course the WSJ or
NYT will have a better overall track record than, say, BuzzFeed, but that
doesn't mean there won't be cases in which the aforementioned elite media
organizations try to skew the conversation.

A good example of this is the ND pipeline. None of the established players are
giving it the coverage it seemingly deserves; without YouTube and smaller
outlets, the majority of the country wouldn't know anything is happening.

------
natrius
Facebook isn't the problem. Advertising is. Vox can't tell you this without
hurting themselves.

We fund content distribution (Facebook) and content creation (actual articles)
based on how many people are looking at them, so journalists produce clickbait
and Facebook gets you addicted to it. We can call it "maximizing daily active
users" instead of addiction if you like, but it all sounds the same to me. If
you're asking Facebook or journalists to make less money to create a better
world, don't be surprised when they say no.

Instead, individuals must take on the responsibility to fund the production
and distribution of content they find valuable. That won't eliminate bias, but
it'll eliminate the very strong financial incentive embedded in capitalism to
create a compelling alternate reality for each individual on the planet. It is
tearing us apart. Start a recurring donation to a nonprofit news organization
today, then turn on your ad blockers. Not just for intrusive ads. For
everything.

~~~
grubles
No, Facebook absolutely is the problem. See: modifying users' feeds to conduct
a random, secret, psychological experiment on unwilling humans.

[http://www.theatlantic.com/technology/archive/2014/09/facebo...](http://www.theatlantic.com/technology/archive/2014/09/facebooks-mood-manipulation-experiment-might-be-illegal/380717/)

~~~
yummyfajitas
What's wrong with this? Would it also be wrong if Facebook changed their
algorithm but deployed the update to 100% of users?

If not, why is 50% somehow bad while 100% is ok?

~~~
grubles
Facebook modified certain users' feeds to only show "sad" or "negative" posts
and effectively hide "happy" or "positive" posts, while simultaneously doing
the opposite to others' feeds. Grade A+ psychologically manipulative
experiment. This is why I don't engage in the Facebook nonsense anymore.

------
return0
It's immoral for either Facebook or Twitter to take sides - and the idea of
letting them edit the news is horrifying. Facebook and Twitter are each
monopolies, unlike the diverse traditional press. I find it unacceptable, for
example, that Facebook's CEO has endorsed a candidate. In well-functioning
democracies, social media should be held accountable to act as a social
lubricant and nothing more.

~~~
norikki
they're private companies that can do what they want on their platforms, but
if fb actively censored my posts i and many ppl i know would immediately stop
using it

~~~
return0
> they're private companies that can do what they want on their platforms

Newspapers are private companies too.

------
adilparvez
Facebook and companies like it will in the long run harm more than just
democracy. See
[https://www.theguardian.com/technology/2014/jun/30/facebook-...](https://www.theguardian.com/technology/2014/jun/30/facebook-emotion-study-breached-ethical-guidelines-researchers-say).

------
tim333
>Most people today don’t get their news ... They open a social media app

I'm not sure that's true. A 2016 Pew Research report has TV as easily the
number one source.
[http://www.journalism.org/2016/07/07/pathways-to-news/](http://www.journalism.org/2016/07/07/pathways-to-news/)

I'd be more worried by Fox News than Facebook. Also Breitbart and the Daily
Stormer can be scary experiences.

------
knieveltech
That time when Vox blamed Facebook for the US being inundated with credulous
idiots. Here's a counter-proposal: instead of whinging about social media, why
don't we make _Manufacturing Consent_ required reading?

Edit: oooh, hit a nerve there.

------
WheelsAtLarge
I agree with Vox, but they are being too narrow by asking only Facebook's
founder to fix it. I would say that social media is a threat to the democratic
system as we know it, and that system is about to be tested like never before.
With the ability for everyone to have a voice via social media, single-minded
groups are easier than ever to create. As we have seen, many of these groups
are unwilling to compromise their ideas, which makes it likely that chaos will
erupt over any hot issue. We saw examples with the Arab Spring and Occupy Wall
Street: they erupted, yet have had no real results, primarily because there
was no real leadership behind them to move them forward. In many ways you can
say those revolutions made matters worse.

When everyone is upset and there are many points of view, there is no common
way to move forward, but there are many hotheads willing to shoot first and
ask questions later. Imagine a million hotheads with no common goal but a
willingness to fight, and we can see chaos without results.

People get upset at the "do-nothing Congress" because it can't get X done, but
people aren't willing to admit that the reason is that voters have sent
individuals with very diverse ideas to try to get things done. Voters are the
ones pushing them never to compromise on any idea, on pain of being voted out.
I can see a future where one person who can use social media very well pushes
people to vote in ways we would consider distasteful now. What will happen
then? Groups will erupt with opposing views, and many will be ready to fight.

We've heard allegations that the voting system is rigged, but that's very
unlikely; we have laws and watchdogs that prevent that in any significant way.
We don't have the same for social media, yet we know it's possible to
manipulate it, even by foreign powers, and that's not illegal; worse yet, it's
hard to impossible to prevent. Anything published, whether true or not,
becomes true in effect if enough people repeat it. It's hard to even
contemplate how that affects a democratic system.

The founding fathers created a representative government because they knew
that rule by majority can be as distasteful as rule by a monarch or emperor.
They thought a functioning government needs representatives who can sort out
what's needed for a better nation. With everyone having a voice, that's going
to get extremely difficult. Social media is about to let the US test out its
governmental system; let's hope it can weather the trouble ahead. Can the US
stay together as a nation when there is no common ground among the citizenry?

~~~
norikki
what country are you from?

~~~
WheelsAtLarge
US

------
piotrjurkiewicz
FB already has too much power, and its moderators already suppress
'politically undesirable' content too much. And Vox wants them to employ human
editors to evaluate stories on top of that?

One week ago, FB blocked an event page of Independence March, which
traditionally takes place on November 11th (Polish Independence Day) in
Warsaw. This is the biggest mass event of Independence Day, having more than
100k participants each year.

FB also blocked or removed the pages of NGOs and political parties that
organize or support the march (some with 80k or 170k followers), as well as
the personal accounts of people involved in these organizations.

Then FB went full rage and started blocking the personal accounts of everyone
who invited others to, or even positively mentioned, the Independence March,
including, for example, the personal account of the editor-in-chief of the
second-largest daily newspaper in Poland
([https://twitter.com/sjastrzebowski/status/793001362070052864](https://twitter.com/sjastrzebowski/status/793001362070052864)).

The most extreme case was the personal account of an MP who wrote on his
timeline: "I will be [on the Independence March] along with my family, whether
FB likes that or not."
([https://twitter.com/jakubiak_marek/status/793497135954202625...](https://twitter.com/jakubiak_marek/status/793497135954202625/photo/1))
His profile was blocked for 24 hours after that.

Another case was the personal profile of a retired Intelligence Agency
officer, who revealed in an FB post that a local coordinator of an
anti-government liberal-left protest movement during the communist period had
been a colonel in a Soviet-controlled military intelligence agency.

All of this happened within just the last month. FB's actions generated a huge
pushback and hit the headlines. The Deputy Minister of Justice called FB's
actions "censorship". The Minister of Digitization tweeted that she had "asked
FB management for a talk". Many people started deleting their FB accounts in
protest.

FB got frightened and reactivated the Independence March event page, but many
nationalist/conservative organizations' profiles remain blocked.

------
Frqy3
When newspapers and television news held influence, most countries had media
ownership laws.

Humans and algorithms will have bias, but a diversity of bias is good.

If industry cannot implement some self regulation, it risks finding itself
with external policy and regulatory controls.

------
partycoder
Churchill said, "Democracy is the worst form of government, except for all the
others." A similar idea applies to the press and the dissemination of reported
events.

News networks were much worse: they could arbitrarily censor anything that
didn't result in a higher TV rating or more advertising revenue, or that
didn't represent the interests of the investors/donors/advertisers... oh,
wait. Same on Facebook, to some extent :-)

That's why it's important to decentralize the Internet.

------
dominotw
Censors opinions I don't like - Private company can do whatever it wants

Doesn't censor opinion I don't like - Harming our democracy.

Eg:

Trump is harming our democracy .

Wikileaks is harming our democracy.

FBI is harming our democracy.

Pepe the frog is harming our democracy.

Russia is harming our democracy.

~~~
fixxer
And we're all too stupid to make up our own minds about what/who to believe.
Thankfully, we have Vox to Voxplain it all for us ignorant masses.

------
oldmanjay
I'm sure Vox would like it if Facebook were obligated to promote approved
opinions, but I'm not comfortable with _any_ given group of people doing that
sort of approval, so right now I consider an algorithmic approach inherently
superior.

~~~
privong
> I'm not comfortable with any given group of people doing such sort of
> approval, so right now I consider an algorithmic approach natively superior.

That is a point the article makes, though. The algorithmic approach is
effectively equivalent to a particular group of people doing the approval (or
rejection). Which group that is manifests itself in the way the algorithms are
coded and the data fed into them.

 _Edit:_ This is similar to the conclusion people are coming to regarding
algorithms for court sentencing[0] and other areas where algorithms are
becoming increasingly used. It's dangerous to conclude that an algorithm is
better just by virtue of it being code. That code (and/or its training data)
may be reinforcing existing stereotypes and biases.

[0] [https://www.engadget.com/2016/06/26/wisconsin-sentencing-alg...](https://www.engadget.com/2016/06/26/wisconsin-sentencing-algorithm-faces-court-battle/)

~~~
yummyfajitas
The conclusion that _numerate_ people are coming to regarding algorithms for
court sentencing is quite different from what you describe.

The conclusion there is simply a theorem: it's mathematically impossible for a
score to be both well calibrated (a black person and a white person who
receive the same risk score have the same probability of committing a crime)
and racially balanced (similar rates of false positives), except in trivial
and unrealistic cases.

[https://arxiv.org/pdf/1609.05807v1.pdf](https://arxiv.org/pdf/1609.05807v1.pdf)

See also this WaPo article explaining it in simpler terms:
[https://www.washingtonpost.com/news/monkey-cage/wp/2016/10/17/can-an-algorithm-be-racist-our-analysis-is-more-cautious-than-propublicas/](https://www.washingtonpost.com/news/monkey-cage/wp/2016/10/17/can-an-algorithm-be-racist-our-analysis-is-more-cautious-than-propublicas/)

But the thing is, this theorem applies to _any_ decision process - human or
algorithm. You can't escape mathematical impossibility results just by using
humans (e.g. NP Complete problems are still hard even if done by humans).

Human processes just add additional bias, e.g. the editorial slant that Vox
wants Facebook to add, or the (alleged) bias that Facebook's human editors
added before Facebook went all algorithmic.
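
The calibration-vs-balance tension above can be seen with toy numbers. A
minimal sketch (all counts invented purely for illustration): a risk score
that is perfectly calibrated within each of two groups still yields unequal
false-positive rates once the groups' base rates differ.

```python
# Toy demonstration of the calibration/false-positive-rate tension.
# Each person is a (risk_score, actually_reoffended) pair; counts are invented.
group_a = ([(0.8, True)] * 40 + [(0.8, False)] * 10 +
           [(0.4, True)] * 20 + [(0.4, False)] * 30)     # base rate 0.6
group_b = ([(0.5, True)] * 10 + [(0.5, False)] * 10 +
           [(0.125, True)] * 10 + [(0.125, False)] * 70)  # base rate 0.2

def is_calibrated(group):
    # Calibrated: within each score bucket, the observed reoffense
    # rate equals the score itself.
    for s in {score for score, _ in group}:
        bucket = [y for score, y in group if score == s]
        if abs(sum(bucket) / len(bucket) - s) > 1e-9:
            return False
    return True

def false_positive_rate(group, threshold=0.5):
    # Fraction of non-reoffenders whose score puts them over the threshold.
    negatives = [score for score, y in group if not y]
    return sum(1 for s in negatives if s >= threshold) / len(negatives)

assert is_calibrated(group_a) and is_calibrated(group_b)
print(false_positive_rate(group_a))  # 0.25
print(false_positive_rate(group_b))  # 0.125
```

Both groups pass the calibration check, yet innocent members of group A are
flagged at twice the rate of group B (0.25 vs. 0.125) — exactly the kind of
disparity the theorem says is unavoidable when base rates differ.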

~~~
urish
My only disagreement is with:

> Human processes just _add_ additional bias

Bias with respect to what? As you say, there is already bias baked into the
data collection and the algorithmic choices.

The bias that human editors introduce is different, but not necessarily
larger, however you even measure it. There are also myriad human choices
behind the choice and deployment details of the algorithm.

An important plus for human editors is greater interpretability and greater
transparency regarding the biases the system ends up showing.

~~~
yummyfajitas
It's been reproduced across many experiments that humans will add bias that
harms accuracy when making decisions. I.e., if x[6] represents race, humans
will systematically wrongly weight x[6].

Machines simply don't do this.

 _As you say, there is already bias baked into the data collection and the
algorithmic choices._

That's not what I said.

What I said is that you can't have collective equality (e.g. same rate of
false positives, lack of disparate impact) and also accuracy (getting the
right answer) except in trivial/unrealistic cases.

Human editors are fundamentally less interpretable and transparent than
machines. You can easily interrogate machines and test for bias; how do you do
that to humans?

Or, to take a historical example, why did colleges switch from algorithms to
humans when the Supreme Court said that transparent racial bias is forbidden?

