
Facebook Fired Employee Who Collected Evidence of Potential Political Bias - erentz
https://www.buzzfeednews.com/craigsilverman/facebook-zuckerberg-what-if-trump-disputes-election-results
======
fbthrowaway31FB
As a Facebook employee who watched this thing unfold:

His arguments and reasoning were edited for tone, reposted by others, and HR
did not take any of that down. He wasn't fired for what he said, but for how
he said it. His post was vicious and demeaning toward other FB employees. It
directly attacked the character and ethical constitution of specific employees
in the company.

That said, HR did later revoke open access to some of the specific details he
used to make his case (conversations around investigating and escalating user
reports). FB gives employees access to much more than most companies, so the
sort of things HR made secret would not be visible at most US companies.

Although I think at FB there are many things you can't say, his actual beliefs
were not bold or rare. Many people say similar things all the time. But you
can't go around attacking people like that.

~~~
baby
I don’t know that specific story, but the culture at FB is pretty open and
people ask hard questions all the time on Workplace and at Q&As. So it’s sad
to see some people abusing this, knowing that not many companies are open to
such internal dialogue. Tragedy of the commons.

~~~
xoxoy
what about the engineer who jumped to his death after a bad review?

~~~
baby
Do you have a specific question? It’s tragic obviously and impacted many of
us.

~~~
shajznnckfke
(different commenter here)

Of the handful of people I know who work at FB, all of them are super smart
and motivated. Half of them have expressed a lot of anxiety, stress, and
imposter syndrome to me over their reviews. I’ve seen this at other companies
when a project is failing but I hear about it at FB from A players who are
working on successful projects. It gives me the impression the managers must
be brutal - like they just push harder than other companies and make people
feel they aren’t good enough to get more output. These are my impressions as
an outsider. Does it seem fair to you?

~~~
ubermon
I think it's not managers but peers. They are mostly overachievers and
somewhat competitive.

~~~
Infinitesimus
Is that because leadership rewards such behavior? I wonder if the pressure
would be the same if working 12-hour days were discouraged and emphasis were
placed on collaboration and team delivery.

Leadership sets the tone and culture, so it's always leadership's fault if the
culture is toxic.

------
wiredone
“Potential” political bias.

Can we just acknowledge for a moment what it’d be like to have a colleague
who’s desperately trying to prove that your work is a “big conspiracy” when
you’re honestly all just trying to do your best.

I see so many of these stories lately of weirdly rogue political-activist-type
employees who are surprised when their employer doesn’t like that they’re
purposely trying to undermine the company and their colleagues’ work. I’m sure
Facebook isn’t perfect, but I know enough good people who work there to know
they’re not all out to “get” us.

This smells like sour grapes* from an employee who focused more on their plans
to take down their employer than their actual job.

*ed: previously written as “sour apples”...it’s been a long day.

~~~
Mekantis
Sorry, but "good people" who are involved with all the evil things Facebook
does by working for them aren't good people. I know it's hip to absolve
yourself of responsibility by saying "I just work here" or "I didn't work on
that", but all these rationalizations don't fly as long as you're one of the
quiet majority that does nothing to fight what Facebook does.

Seriously, techies seem to act on the same amoral plane as business executives
and I wonder how you people manage to sleep at night.

~~~
wuunderbar
There are two things to keep in mind to follow your train of logic:

1) The company is actually doing evil things

2) The employee is aware of the purported evil things the company is doing,
and agrees with that claim

Now we can debate all we want about how evil Facebook is or isn't, but how
many employees do you think check off box 2?

~~~
otterley
I find it difficult to believe that the average Facebook employee is exposed
to less media coverage about their own company than a member of the general
public is. So the remaining possibilities are that the employee either agrees
with the policy (in which case, we can question their judgment), or they're
just trying to keep food on the table (in which case, we should wonder about
how healthy the job market is).

~~~
gfxgirl
There are more options than you've presented. There is also the possibility
that the employee has perfectly valid and logical reasons for believing the
policy isn't evil. In other words, the issue is that you see the policies as
evil with no possibility of seeing otherwise.

I use facebook to connect with my family and friends. I've unfollowed all
overly political friends. I get no real or fake news in my facebook feed. If I
worked at facebook I'd be proud of helping people to stay connected.

~~~
otterley
There’s no question that there are good things about Facebook, but I think a
fully contemplative person would need to balance those things against the
harms Facebook allows to exist on their platform. It’s not a black and white
question of good vs evil, as much as it is “does what this company does (or
not do), on balance, comport with my values of right or wrong?” Evidence of
wrong-doing and questionable judgment by the company’s leaders is abundant.
And at some point you can’t continue to work for a company without being
complicit in its sins; after all, a company hires someone because they need
them to help execute the company’s decisions.

~~~
conradev
What if you replaced “Facebook” with “the US federal government”?

Can you continue to work for the digital service without being complicit in
the government’s sins? There is plenty of evidence of wrong-doing and
questionable judgement by the government’s leaders, for sure.

Say everyone who did take issue with Facebook’s policies decided to leave –
who would be left?

~~~
otterley
The Government is subject to a Constitution that has an amendment process, and
laws are made by elected leadership who can be voted out by the people.

Facebook is not a democracy, and given the corporate structure and differing
voting rights of shareholders, it is practically impossible to replace the
current leadership even if a majority of shareholders wanted to.

Moreover, the Government is immense and highly diverse in its missions
compared to Facebook. I don't think anyone reasonably believes the sins of the
CIA should be borne by, say, U.S. Forest Service park rangers. If Facebook had
a public service mission that, among other things, provided essential services
to the public like food and housing, perhaps the discussion would be
different.

> Say everyone who did take issue with Facebook’s policies decided to leave –
> who would be left?

Some might say that this is the desired outcome, but eventually management
might see the writing on the wall and change the way they do business. Indeed,
this is one of the key mechanisms of unionized labor - to bring management to
the table for negotiation through the threat of work stoppage (striking).

~~~
conradev
I think the government is indeed massive with a diverse set of goals, but I
think employees at Facebook may see their workplace the same way:

- They could work on something “good”, like trust and safety, community
moderation tools, emergency response, etc.

- They could work on something user-driven or “neutral”, like Events or
Groups, which are used to organize anything from BLM protests to Trump rallies.

If they’re not actively designing privacy anti-patterns or trying to make the
news feed more addictive, quitting would not really affect those problem
areas.

I think organizing at the workplace is a great idea, and the threat of work
stoppage could be a very real one if enough employees organize. But the idea
of telling people to quit their jobs and distance themselves from the problem
is very different from telling them to start organizing to fix the problem.

------
fb5400
This is a rough situation. The guy was pulling on an important thread to
unravel a complex issue and get us all transparency. And then he wrote a
company wide memo that called Mark a "robber baron, slaver, and plunderer",
directly blamed Mark for covid deaths due to his implied support of Trump, and
levied a bunch of other personal attacks.

I can't imagine any person at any company who wouldn't get fired after
publicly speaking to the CEO that way.

It's a shame, because if this guy had more emotional intelligence he could
have personally driven change. Now he's a martyr at best.

~~~
travisjungroth
It’s weird coming into this thread after just finishing Schindler’s List. It’s
like the polar opposite of how to effectively operate in an evil system.

For anyone who believes they’re participating in an evil system: quit,
whistleblow, or sabotage. Don’t be like this guy.

~~~
Miner49er
He literally did all three? He whistleblew, sabotaged by doing this in a way
that got to the media and by insulting Zuck, and he got himself fired.
~~~
travisjungroth
Whistleblowing would be working with the media or regulators to get out as
much information as possible. Sabotaging is staying at the company and
thwarting their efforts. What he did (calling attention internally to messages
that were already available for employees to see) barely counts as either of
those. Sure, you could say he effectively quit.

------
exogeny
I can't believe there's actually a debate at all about what's going on here.

There are multiple sources indicating that conservative sites work in concert
to amplify, at best, specious content and at worst, complete misinformation.

Facebook selectively enforces discipline on these sites because:

1. Conservative information (Fox News, Charlie Kirk, Breitbart, etc.) is
consistently shared more than other content

2. They don't want the optics of making it seem like they're censoring
conservative voices because they're afraid of the blowback, so they're hiding
behind the veil of "It's not our job"

3. They don't want to lose the ad spend running up to the election season, nor
do they want to lose the vanity metrics of DAU/MAU/etc., because they're a
publicly traded company

Zuck is a feckless coward, full stop. And while I don't know his personal
political leanings, this non-policy certainly has the earmarks of Thiel and
Joel Kaplan all over it.

~~~
dylan604
1) Why is this? Is it because both sides share it, vs. just typical
echo-chamber sharing? I know it's not "shared more because more people believe
it," as the US is pretty close to 50/50.

~~~
corin_
I don't have data on what causes this, but there certainly seem to be some
very savvy ad-strategy decisions on the Republican side, such as this:
[https://twitter.com/kevinroose/status/1290739595278028800](https://twitter.com/kevinroose/status/1290739595278028800)

TLDR is that if you know certain subjects will get your base talking about
something else that you aren't allowed to promote (such as a conspiracy
theory), you promote the hell out of real news stories about those subjects
knowing your base will a) share and b) discuss the conspiracy theories in the
comments.

It's a very clever way to use money to boost the conversation around untrue
claims, while technically neither the ad buyer nor the seller is responsible,
since it's the users doing the lying. And I don't know what the solution is;
fact-checking every user comment isn't feasible.

------
solidsnack9000
Taking this together with the recent house hearings and with other material,
like AOC's questioning of Zuckerberg, it seems like leftist people (a) are
distrustful of the tech giants' business and wealth and don't really believe
it's been beneficial; and yet (b) trust them to regulate speech and want them
to do so more aggressively. It's not logical.

~~~
JamisonM
It seems like your (b) is mistaken: they _want_ them to regulate speech (let's
leave the "aggressively" part aside) and to do so within clear parameters. It
seems fairly clear that they are not trusted to regulate speech but rather
expected to, and are to be closely monitored for how they do so. The hearings
themselves are evidence of that.

~~~
im3w1l
But if you don't trust them, then how do hearings help? Hearings only ask
questions; they don't create any change.

~~~
erikerikson
Often politics aren't about the person being spoken with but the people
watching the conversation.

------
luxuryballs
“Facebook employees want CEO Mark Zuckerberg to explain what the company would
do if the leader of the free world uses the social network to undermine the
results of the 2020 US presidential election”

this is so silly. And later, when the article claims “misinformation”, it’s a
tricky situation to claim you know the 100% truth about everything, and that
anything that goes against what you believe is considered misinformation that
will undermine an election... totally ignoring the possibility that you might
actually not be correct and that things aren’t as black and white as you think
they are

~~~
newacct583
> this is so silly

Why is it "silly" exactly? Are you saying that Facebook wouldn't allow itself
to be used this way, that the government wouldn't try, or that it's not
possible? Are you contending that the results of an election are not "black
and white"?

~~~
fastball
Clearly the results of an election are not always black-and-white, otherwise
there wouldn't be contested elections.

"Undermine" is obviously a subjective judgement, not an objective one. Which
in turn makes it difficult to legislate "prevent people from using social
networks to undermine elections" in an objective way.

And this is now my personal opinion, but legislation that cannot be enforced
with some amount of objectivity is a recipe for disaster down the road.

------
smsm42
As an example of bias, I've recently seen on Facebook a post that instructed
the readers to shoot law enforcement people in the face. Not exaggerating,
those are the exact words, "shoot him in the face". Facebook told me this is
well within the "community standards". Yet for example Zuckerberg personally
promised to ban any page that promotes protests against COVID lockdown. So,
calling for protesting lockdown - ban, calling for murdering the police - OK.
Sounds like a bias.

------
andrewflnr
Of course this happens because if Facebook actually nukes pages like
Breitbart, right-wingers will claim this is evidence of Facebook's leftist
bias. I see this claim on my feed all the time already.

There is nothing Facebook can do that won't cause a riot. But if there's truly
no acceptable action for Facebook and similar platforms, maybe the only option
in the long-run is for them to cease to exist.

~~~
thehappypm
Not really true. Content policy can take a lot of different forms. There’s a
purely compliant approach — only respecting the legal requirements, like
illegal content — and then there’s an approach of being Big Brother, taking a
censorship role. Facebook seems to want to be at least a little bit of a
censor.

~~~
JamisonM
I believe there are no examples of platforms with any significant reach that
can possibly take a purely compliance approach. Speech permissible by law in
many jurisdictions is well beyond what is tolerable in public forums - or
tolerable in your office meeting room.

~~~
FeepingCreature
There is no such thing as "tolerable"; there is only what one can tolerate. To
phrase it that way raises the question of who claims they're incapable of
tolerating what.

"Intolerable speech" makes it sound like it's the speech's fault.

~~~
r00fus
"Fire in a crowded theater" is intolerable speech, and so is publicly calling
for executions. It's enshrined in jurisprudence.

US only, other nations probably have larger lists of free speech exclusions:
[https://en.wikipedia.org/wiki/United_States_free_speech_exce...](https://en.wikipedia.org/wiki/United_States_free_speech_exceptions#As_educator)

~~~
mardifoufs
>"Fire in a crowded theater" is intolerable speech, and so is publicly calling
for executions. It's enshrined in jurisprudence.

Directly threathening to execute someone maybe. But "Fire in a crowded
theater" has not been part of jurisprudence for... 40 years. The US has very,
very limited free speech restrictions.

[https://www.theatlantic.com/national/archive/2012/11/its-time-to-stop-using-the-fire-in-a-crowded-theater-quote/264449/](https://www.theatlantic.com/national/archive/2012/11/its-time-to-stop-using-the-fire-in-a-crowded-theater-quote/264449/)

------
comfyinnernet
"Facebook did not answer questions about why a partner manager would cite ad
volume as a reason for not acting against a group of pages."

Oops somebody said the thing out loud.

------
trident1000
This type of thing happens to people on the right so often it doesn't even
make the news anymore.

~~~
sanderjd
The other post asking for examples is downvoted, but I think your statement
could really use some. I can only think of one highly visible one, and it's
interesting how much it reminds me of this one, though on the opposite side.
But more examples would be illuminating.

~~~
sanderjd
I can't reply to the flagged reply anymore, but I see now that I was thinking
of this thread as "within the tech industry", while they were thinking of it
more broadly. Looking up-thread, I see that my narrower interpretation isn't
necessarily the right one.

------
solidsnack9000
With regard to right-wing pages that contact Facebook to get their flags
resolved, the article seems to suggest it's inappropriate that they reached
out directly -- but the business relationship is between Facebook and the user
with a profile, not between the user and a third party.

 _These and other interventions appear to be in violation of Facebook’s
official policy, which requires publishers wishing to dispute a fact check
rating to contact the Facebook fact-checking partner responsible._

 _[...]_

 _Instead of appealing to the fact-checker they immediately call their rep at
Facebook..._

This is a little like when my package gets lost and -- very rarely -- the
shipper tells me to contact the post office instead of simply claiming the
insurance and refunding me. These shippers often _don't_ give me the receipt
I need to show the post office that it was insured -- because telling me to
contact the post office is just another way of saying "not my problem." The
post office is likely to say: please contact the shipper...

If the user isn't happy with what Facebook did -- regardless of whose advice
Facebook took -- they have every right to contact Facebook.

------
kanox
Good, companies shouldn't employ people who deliberately smear and sabotage
them.

------
ergocoder
US politics is so fucked

- Fake news is bad

- Government taking out fake news is actually bad??

- But Facebook taking out fake news based on its own judgement with
absolutely no due process is obviously okay???

Now it becomes a shouting match to get Facebook to ban posts/people they don't
like.

Who can harass Facebook employees the most will probably win the shouting
match.

~~~
three_seagrass
The very first amendment of the U.S. Constitution states that the government
cannot censor the expression of speech.

There's a big difference between a private company determining what is hosted
on their servers vs. the U.S. government forcing a private company to censor
speech.

~~~
Natsu
What happens when that applies to ISPs determining what goes over their wires?
We have a privately owned public square at this point. Further entrenching
that power seems unwise.

~~~
threeseed
So in your hypothetical world there is some conspiracy among the thousands of
ISPs, which will work together to decide what content people see.

Even though consumers can simply move to another ISP, just as they are free to
move to another website.

~~~
Natsu
Thousands? There are only a handful of ISPs at the top and, oh yeah, the last
time they _made a plan to collude_, it birthed Net Neutrality.

So I'm not sure what part of it happening before makes it seem unlikely to
happen again.

------
iron0013
The actual title of this article is “Facebook Fired An Employee Who Collected
Evidence Of Right-Wing Pages Getting Preferential Treatment”

~~~
dang
The submitter rewrote it because it didn't fit the 80-char limit.
[https://news.ycombinator.com/item?id=24077718](https://news.ycombinator.com/item?id=24077718)

------
jungletime
A day ago, "Facebook removed from Trump’s official account the post of a video
clip from a Fox News interview in which he said children are “almost immune”
from covid-19." (source: WP)

Is this really misinformation, or a false statement, worthy of censoring a
president's speech? For one, "almost immune" is not the same as claiming
"immune". So it's not even a false statement, any more than saying I "almost
caught" a fish when you didn't.

The implication is that children have a low infection rate.

Regardless of the truth value of "almost immune", removing the post also robs
people of valuable information -- for example, whether Trump is leaning toward
having schools open in September.

~~~
shripadk
My question is why censor the US President? Let people know what Trump says
and make their own judgement. I hate this sort of political censorship by
social media giants. At the same time, you can literally find porn and gore on
these sites which aren't taken down automatically. What with advancements in
AI and all it should be easy to remove right? Nope. These are the videos that
linger on even after you report them. And they have ads running on them too.

If they think people are fools and won't notice it they are so wrong. It is
only going to amplify voices for an alternative platform. My fear is that it
will create platforms divided on political biases rather than a truly free
platform where people from all sides can participate. This sort of
polarization is not healthy.

------
jgacook
Eye-opening how many HN comments are framing this guy as a demented political
extremist. He’s a whistleblower exposing one of the most frightening threats
to modern democracy that exists.

We are beyond the speculation point here. Facebook has demonstrated that it
has the capability to spread misinformation and sway political opinion very
effectively, and never for the right reasons. The pandemic is the most recent
example of Facebook’s fuckery: post something that tells people that Covid is
caused by 5G and that Bill Gates wants to implant microchips in you? You
might get a small disclaimer on your FB post saying that “parts of it” “may
be” inaccurate. You are constantly fed anything that keeps you engaged with
the platform and increases Facebook’s bottom line.

Zuckerberg has just spun to Congress that Facebook is going to extreme lengths
to moderate its platform - this employee’s testimony is proof that Facebook
doesn’t care in the least, or worse, that it may be actively trying to help
along candidates who will be politically lenient toward its behavior. We just
discovered a “bug” caused Republican ads to show at a more frequent rate than
Democratic ads - I don’t know what’s more worrying: that a bug on Facebook
could swing the election or that it wasn’t a bug at all.

People need to stop with the “I know good people at Facebook” B.S. It’s beside
the point. Facebook is a rogue actor that has enormous social and political
power to completely mislead the general public into practically any belief.
They have continued to demonstrate that they cannot be trusted with this
power. This man should be praised as a whistleblower for exposing the charade
that is an unregulated Facebook. We need more like him in tech to show people
how serious the problem is. Give me a break with the analyses of whether he
should have been fired or not, and recognize the scope of the problem he is
trying to warn us about. The quality of our democracy is absolutely at stake.

~~~
jmeister
People are free to not use FB. Let’s assign some agency to the users too.

~~~
TheSpiceIsLife
Do you mean to imply that advertising and propaganda _don't work_?

That doesn't seem like an argument anyone would _want to make_.

------
aspenmayer
For this employee’s sake, I hope they have adequate legal representation, and
that they intend to sue.

> While there are signs Facebook will stand up to Trump in cases where he
> violates its rules — as on Wednesday when it removed a video post from the
> president in which he claimed that children are “almost immune” to COVID-19
> — there are others who suggest the company is caving to critical voices on
> the right. In another recent Workplace post, a senior engineer collected
> internal evidence that showed Facebook was giving preferential treatment to
> prominent conservative accounts to help them remove fact-checks from their
> content.

> The company responded by removing his post and restricting internal access
> to the information he cited. On Wednesday the engineer was fired, according
> to internal posts seen by BuzzFeed News.

------
mattnibs
Counterpoint: [https://www.projectveritas.com/news/another-facebook-whistleblower-interference-on-a-global-level-in-elections/](https://www.projectveritas.com/news/another-facebook-whistleblower-interference-on-a-global-level-in-elections/)

Depending on which media outlet you turn to, Facebook is trying to skew the
election toward either Trump or Biden. Depending on your political bias, you
get the story that fits your narrative. Perhaps our framing is off?

------
three_seagrass
Sounds like the James Damore event for Facebook.

------
known
Listed companies should have comprehensive whistleblower protection programs;
the government should amend laws accordingly.

------
xoxoy
Is it just me, or would it be very hard to work at Facebook these days without
constantly experiencing cognitive dissonance day in and day out?

It feels like the only way to enjoy working there is to either be someone who
only cares about the money and prestige, and/or to convince yourself that the
criticisms against Facebook are invalid.

~~~
baby
I can assure you that what you’re imagining Facebook to be has nothing to do
with how Facebook is on the inside. There are way more of these debates and
internal discussions than you'd expect, which reflects that Facebook’s
employees have pretty diverse points of view.

------
lalos
Unrelated, but has anybody found the timing uncanny between the Instagram
TikTok competitor launching the same week that Trump started talking about
banning TikTok? That, plus all the heat Facebook/Zuckerberg have taken for
enabling political ads (compared to Twitter/Dorsey), plus private dinners with
the current administration, which is very protectionist? Anyway, it's
interesting to see how the US government will handle these foreign-established
tech companies and their data by messing with the free market.

