
Ask HN: Current and former FB employees, what's the other side to the story? - adpirz
Another day, another story of Facebook's reach overextending, breaching privacy unnecessarily, and generally acting in some form of bad faith.

But business is clearly still good and it seems like they're still able to recruit plenty of talent. So what's the other side here? Internally, are there feelings of consternation about all the negative press, or is there a bigger piece that is missed by it?

I'm really curious what the other side is here, especially given FB hasn't slowed down at all despite all the press.
======
fbemployee1234
Throwaway for obvious reasons. Opinions are my own.

I joined within the last year. I had major issues with FB before, reflecting
the typical HN user stance. I joined because I was curious and because their
pitch made it sound like an amazing place to work.

And it is. They treat us better than any other company in terms of autonomy
and input. Everyone has a seat at the table.

From what I've seen, the external perceptions don't match the internal
objectives. I hear real commitment to change from leadership, and most of the
major issues are years old.

FB has invested hugely in protecting elections, possibly more than any single
government, and has doubled down on integrity work, etc.

And realistically, I don’t think we know how to balance harm and good in the
real world. I just don’t see tracking online activity as more harmful than the
benefit that people get from the service. (Not so much for US users, but the
benefits to people in very poor countries are very real, and wouldn't exist if
the service didn't monetize so well.)

In short I don’t see them as an evil entity. They’re just a large one that is
easy pickings for negative press. I used to work for the US government in a
health research role; this feels similar.

When something is so massive and decentralized, the “bad press” events are
gonna happen. I think FB, and especially Zuckerberg, have done a great job
responding to these issues to try and solve them.

Remember, what Facebook is doing has never been done before. There are going
to be mistakes.

~~~
warp_factor
Let me disagree on this.

I think you joined Facebook, and they showed you the "employee side" where
everything is beautiful, with free food and free toys, and you became
convinced. Yes, they treat employees super well; nobody contradicts that. But
how does that change anything about how bad the product is (and all the bad
things it does to people who use it a lot)?

It is easy to let the beautiful inner side of Facebook eclipse the product
side of Facebook that everyone else sees from the outside.

~~~
throwawaymath
I get that you have a fundamental disagreement with the opinions expressed by
this commenter, but please don’t invalidate their experience like that.
Saying, “Sorry but no” is a flippant way of dismissing their experience
without actually being sorry for doing it.

It’s frankly not for you to say “what happened” when it comes to their
opinion, which is based on what _they’ve_ witnessed and experienced. Note that
this entire thread does not concern facts, it concerns impressions and
opinions. This isn’t the place to “Actually, ...” what someone else has said.

It strikes me as particularly condescending that you’re telling this person
they only feel this way because of “free food and free toys.”

~~~
warp_factor
You make a good point and I edited my comment. Thanks.

------
badfrog
I worked at FB for three years as a software engineer. The people I worked
with basically believed:

1) The core product is good and useful to people/society

2) Most of the negative articles about Facebook are based on some piece of
truth but go out of their way to make things seem worse and more sinister than
they really are

3) Zuck generally wants to do the right thing, but of course makes mistakes

4) Yes the company does bad things sometimes, but not significantly worse than
Google or Amazon (which are the main places many FB employees would consider
working if they left)

~~~
warp_factor
I disagree with all of those, but more specifically with

>> 1) The core product is good and useful to people/society.

Definitely not. If you had to put a numerical value on Facebook's impact on
society, I think it would be extremely negative. You grab the attention of weak
users, who end up spending so much time on your platform doing nothing
instead of contributing to things that would benefit society. You create an
illusion that everyone has a beautiful life, pushing some people to the verge
of suicide/depression. And maybe worst of all, you block free speech based on
whatever private criteria you decide, and you allow mass brainwashing by
pushing ideas into your newsfeed. The only positive side is the utility of
connecting people, which existed perfectly well before.

The only reason you get away with all of this is that it is so difficult to
put an actual number on any of it.

~~~
deanmoriarty
As I said in a previous comment, I'd be curious whether this "wasting time
instead of contributing to society" also applies to other entertainment
companies that quite literally consume thousands of hours a year of the
average weak person's time, such as Netflix, which is instead typically seen
as providing very positive value to society.

~~~
warp_factor
While I also wouldn't put Netflix on a pedestal, I think it is in a very
different category than Facebook:

With Netflix, you do lose your time, but you don't get that feeling that
everyone else is having a wonderful time while you're losing yours on the
couch. Netflix (like TV) provides some healthy entertainment.

Facebook, on the other hand, provides entertainment by broadcasting how
perfect other people's lives are. And this is exactly what pushes people to
depression and suicide. Ask people you know, and most of them will admit that
after their daily visit to Facebook they feel a bit burned out. This effect is
also magnified because you now always keep in mind that you should post a
"perfect picture" on Instagram to please your followers.

~~~
deanmoriarty
This seems like a stretch to me, and it certainly diminishes the value of your
original comment: "... that end up spending so much time on your platform
doing nothing instead of contributing to things that would benefit society".

Personally, I've just as well felt a bit burned-out by watching very popular
TV shows on Netflix portraying the amazing lives of some people as I have with
Facebook, and I would never say Netflix has caused negative value to my life,
just as I wouldn't say that with Facebook, which in my personal experience has
been one of the very best products of this decade.

Certainly agree to disagree.

~~~
warp_factor
I can see your point and definitely agree to disagree.

I think the perverse effects of Facebook are not fully understood yet. But I'm
fairly confident we will look back at this as something terrible. Facebook and
Instagram also indirectly created the selfie generation, more self-centered
than ever, with the goal of getting more and more likes. I don't see that
happening with TV.

And then there is the whole free-speech / mass-propaganda-tool / privacy
debate that I'm not even getting into.

------
sxp
Disclaimer: I work for a competitor of FB. Opinions are my own, etc.

There is an aspect of [https://en.wikipedia.org/wiki/Gell-Mann_amnesia_effect](https://en.wikipedia.org/wiki/Gell-Mann_amnesia_effect)
when hearing stories about FB. I've heard similar stories about various tech
companies from major press outlets, and the facts & opinions in the stories
directly contradict my own first-hand knowledge.

Given this data, why should I trust press articles about FB when they're
written by the same journalists who are focused on maximizing revenue for
their own companies? Most articles I see are ~10% cited sources and ~90%
speculation.

Two examples of press stories that most technical people know are false are
[https://en.wikipedia.org/wiki/Satoshi_Nakamoto#Dorian_Nakamo...](https://en.wikipedia.org/wiki/Satoshi_Nakamoto#Dorian_Nakamoto)
and
[https://en.wikipedia.org/wiki/Supermicro#Allegations_of_comp...](https://en.wikipedia.org/wiki/Supermicro#Allegations_of_compromised_hardware).
People with technical knowledge can evaluate those stories and realize that
they're incorrect, but others without technical knowledge probably believe the
newspapers. Similarly, people at FB with insider knowledge probably realize
that most of the press articles about FB are 90% incorrect and that FB is in
the same moral league as all the other major tech companies.

~~~
untog
> they're written by the same journalists who are focused on maximizing
> revenue for their own companies?

That's a pretty huge accusation to casually throw out. Where is the actual
evidence that a journalist writing an article critical of Facebook is doing so
because they are focused on maximising revenue for their own company? Most
journalists have nothing to do with the business side of their company, and
this feels like a way to shut down absolutely any critical article without
further analysis.

Further to that, the two articles you cite have nothing to do with Facebook.
I'd be interested to know what technical knowledge disproves the stories about
Cambridge Analytica or genocide in Myanmar:

[https://www.vox.com/policy-and-politics/2018/3/23/17151916/f...](https://www.vox.com/policy-and-politics/2018/3/23/17151916/facebook-cambridge-analytica-trump-diagram)

[https://www.nytimes.com/2018/10/15/technology/myanmar-facebo...](https://www.nytimes.com/2018/10/15/technology/myanmar-facebook-genocide.html)

I find the genocide case particularly interesting to talk about because it
doesn't really require much technical knowledge to understand at all - the
military set up fake FB profiles in order to push propaganda and promote
ethnic cleansing. It's a question of Facebook's role in society and how
proactive they should or should not be about moderating the content
transmitted through their platform. I don't think there's a simple right or
wrong answer, and it feels like a shame to dismiss the idea entirely because
the article was written by a newspaper that technically competes with Facebook
for ad impressions, or to dismiss it because of an assumption that the story
is false because other false stories exist in the world.

~~~
ydnaclementine
>Where is the actual evidence that a journalist writing an article critical of
_COMPANY_ is doing so because they are focused on maximising revenue for their
own company?

A recent example is Bloomberg and their China chip-hacking story (did that
ever turn out to be true?): [https://www.bloomberg.com/news/features/2018-10-04/the-big-h...](https://www.bloomberg.com/news/features/2018-10-04/the-big-hack-how-china-used-a-tiny-chip-to-infiltrate-america-s-top-companies)

Their writers get compensated if their story moves stock prices: [https://www.businessinsider.com/bloomberg-reporters-compensa...](https://www.businessinsider.com/bloomberg-reporters-compensation-2013-12)

~~~
untog
Well, you removed the word "Facebook" from the quote there, and the Business
Insider article itself states that the practice is extremely unusual in the
industry.

I'll agree that if Bloomberg was the sole reporter of a controversial story
about Facebook that couldn't be proven I'd be very suspicious. But as far as I
can tell, that hasn't happened.

------
docker_up
Most of what the press reports is a distortion of the truth, or makes
something sound worse than it really is.

Sure, there are issues, but all companies have them, just like Google, Amazon,
etc. There are a lot of good people here doing good work, connecting the world
for the better. You don't hear about the 99.99% of good things because those
won't encourage clicks for the news rags.

If the press took an interest in your company, or even in you personally, no
one would look good after their constant digging to make you look as bad as
possible. Imagine if the worst moments of your life were publicized for
everyone to see, without anyone seeing all the good things you've done. People
would think you're an asshole as well.

~~~
jmathai
This should be the top comment.

Let's take a current headline: "Facebook is facing criticism for not allowing
users to opt out of a feature that lets people look them up using their phone
number or email address."

There are so many ways to explain how this could come to be, and the vast
majority of them would not involve malicious intent.

It's not like PMs at Facebook are sitting around wondering how they can
exploit people's privacy. They work on a massively complex product which sits
on extremely sensitive data. Releasing a feature without properly vetting all
of the ramifications (some of which exist completely outside of your purview
or knowledge) is the reality.

Could Facebook have better cultural practices within the company when it comes
to privacy? Sure. But that's a far leap from assuming that workers there are
intentionally trying to exploit people's privacy.

I'm by no means absolving Facebook of the responsibility they have with regard
to individual and global impact. But I imagine Facebook employees are aware of
the amazing benefit Facebook has provided as much as they are the negative
impact it's had. It's not hard to see how many people would believe the good
outweighs the bad.

~~~
chromecanary
I have huge gripes with tech news reporting. NYT/WaPo are usually reasonably
decent. Look at how unbalanced recent tech articles are:

Look at this horrendous article by Josh Constine:
[https://techcrunch.com/2019/02/28/facebook-research-teens/](https://techcrunch.com/2019/02/28/facebook-research-teens/)

> "So 18 percent of research testers were teens. It was only less than 5
> percent when Facebook got caught. "

A very colloquial tone plus hugely biased, loaded terms like 'caught' are
being used. Even basic journalistic standards are not being met.

This isn't a news article. It's an opinion piece (and I am not disagreeing
with his facts but with how opinionated his presentation is).

There's a difference between "GSW lose to Celtics 129-95" and "Low energy GSW
show disinterest in the game and get walloped by Celtics 129-95".

And Josh is one of the more popular tech 'journalists'. So you can imagine how
problematic the whole industry is. There's no "news". There are just opinion
pieces, all of which come from a single elite SF/NY/LA bubble.

~~~
chromecanary
The latest from Josh Constine: [https://techcrunch.com/2019/03/06/facebook-living-room/](https://techcrunch.com/2019/03/06/facebook-living-room/)

"Perhaps this will just be more lip service in a time of PR crisis for
Facebook."

"Now Facebook might finally see the dollar signs within privacy."

Shamefully opinionated comments again from Josh.

------
librish
It does feel like a lot of the news is misrepresented or skewed. Take CA as
an example. This was FB acting as a developer portal, and the following
happened (to the best of my knowledge), using Apple as a substitute:

1. Your friend downloads a slick email app from company X on the App Store.

2. The OS warns your friend with a big blocking screen that this app wants to
read all emails from their address book; your friend clicks allow, and company
X uploads your friend's address book to their server for what they claim are
some cool autocomplete and search features.

3. Apple has also made company X sign an agreement not to misuse the data in
any way outside of what's needed for "core functionality".

Later it turns out that company X took the data from their servers and sold it
to someone else, which means that your email address and any emails you sent
to your friend are now leaked.

Now you can be mad at your friend, company X, or Apple, and I feel like so
much of the focus has been on "Apple" (i.e. FB) when the main piece of
criticism is that perhaps users shouldn't be trusted with a permission this
broad (which had already been fixed 4 years earlier).

~~~
badfrog
> the main piece of criticism is that perhaps users shouldn't be trusted
> with a permission this broad (which had already been fixed 4 years earlier).

Agreed. This broad permission was also widely known well before CA. IMO the
only reason the CA story blew up is that lots of people were upset about the
2016 election and wanted somebody to target their anger toward.

~~~
chromecanary
Traditional news media gave the election to Hillary. And post-election they
found it convenient to blame it all on Facebook.

Facebook certainly isn't influential enough to swing 20 points of an entire US
Prez election all on its own (it obviously contributed, though).

------
wccrawford
"I'm really curious what the other side is here, especially given FB hasn't
slowed down at all despite all the press."

Shouldn't this whole question be directed at the users of FB instead? If they
don't care enough about this to stop using FB, why would you expect the
developers to care enough to give up part of their fat paycheck?

I personally refuse to work for companies that I think are likely to do
business in a way that I think is unethical. But my ethics aren't the same as
everyone else's, and I can perfectly well understand not caring about most of
the "privacy" scandals that Facebook has if the users don't care.

Everyone is entitled to give up their privacy whenever they want, and everyone
should be aware at this point that Facebook plays fast and loose with that
privacy. If they continue to post on FB after this, I can't really feel sorry
for them.

And for the record, I still use Facebook as much as usual. That isn't a lot,
but it's some. I post only to my friends, and I only post things that I'd be
willing to post on a public forum anyhow. Like any other tool, you have to be
careful while using it.

------
throwaway-1283
The other side is 90th percentile market pay...?

At a certain point in a company's lifecycle, most employees shift from being
intrinsically motivated (by the mission, the team, the leadership) to
extrinsically motivated (by the money, benefits, and resume stamp). FB is no
different, and that shift happened well before 2017. I would guess most people
who happily work at FB did not care before Cambridge Analytica and do not care
post-Cambridge Analytica.

If anything, I would guess FB employees are feeling relatively better now
given the broad criticism of many tech giants (Amazon with cities/labor
issues; Google with gov't censorship; Amazon/Google/MSFT with DoD
controversies). The main thing FB has going for it, I think, is that they
don't (yet) compete for any military business, so at least no one's accusing
them of helping kill people.

------
taytus
I don't see the complexity here. They pay competitive salaries and people
don't give a fuck about what FB does and accept the gig. Simple.

Being able to say "No, I don't work here because of principles" is something
really nice, but it's not something everyone can afford to do.

~~~
JeremyBanks
99% of their engineers could afford to do the right thing. They have plenty of
alternatives that pay _very_ well, but they'd rather sell their soul for a few
more luxuries.

They are choosing to be bad people.

~~~
ascendantlogic
You're discovering the dirty secret that a lot of people generally don't care
about doing the right thing when doing the wrong thing is so lucrative. People
will talk a big game but shove a pile of cash in their face and things get
complicated fast.

~~~
portal_narlish
Capitalism 101

------
fbthrowaway2
I work at Facebook. I may have the opportunity to work on a privacy-related
product which, if implemented, would have prevented at least one recent
headline. I'd like to get people's thoughts: would working at Facebook on such
a product be ethically permissible? Does working at Facebook in a capacity
that improves privacy for its users absolve me of that sin by association?
What if I only work on such a product for part of my tenure?

------
notacoward
Current Facebook employee. I'll try to keep this short.

Some of the things Facebook has done have made me sad, or angry. I do wish we
could evolve toward that perfect balance of privacy, authenticity,
connectedness, etc. more quickly. OTOH, I do see a lot of good people around
me, many who share those concerns, and I do see progress being made, and I
know none of us will make a damn bit of difference from the outside.

------
Bhilai
I don't work for FB, but in my experience such decisions are most often taken
by product managers who do not understand privacy implications. They typically
try to bypass security/privacy teams to avoid the hassle of dealing with them.
Also, engineering teams are sometimes siloed and do not realize that the data
they are messing with is sensitive in nature (MFA phone numbers in the most
recent case). Most of these features aren't designed with explicit malice, but
usually to further a goal of linking more data, generating more insights,
showing better ads, and so on, and that's what interests most engineers.

In many cases, of course, they tend to hide behind user consent: "The user
consented to this very broad privacy policy which allows us to do whatever we
want, so it's okay." Having said that, FB did hire a bunch of privacy experts
to fix institutional issues.

~~~
badfrog
> They typically try to bypass security/privacy teams to avoid the hassle of
> dealing with them.

The PMs I worked with at FB definitely wouldn't do that. Having those teams
find a problem after you've built a feature is much more of a hassle than
getting their help to do it right in the first place.

On the other hand, product teams would often fail to get those reviews just
because they didn't realize there could be a problem.

------
fbthrowaway221
Because there are no negative consequences for anything as long as you meet
your deadlines.

And you are fed constant propaganda about how you are making the world a
better place by connecting people and freeing speech. You start to believe it.

And when you bring up ideas that actually make it harder to spread false
information but might reduce engagement metrics, you are laughed out of the
room. Seriously.

Until Facebook's stock price is affected by actual lawsuits, no one inside
Facebook will ever be fired for generating negative news.

------
EngineerBetter
As a person who has never had a Facebook account, can someone explain what
societal good they've done? I see none.

I see espionage on a massive scale. I see democracies weakened for profit. I
see self-obsessed shallow folks constantly taking selfies to post online. I
see the degeneration of political discourse into pithy statements that can be
'liked'.

What good have they done, exactly? Helped people share photos and status
updates?

~~~
basch
It's a tool, right? And from a usability, stability, security, and
functionality perspective it's second to none. No other program (for me) works
more seamlessly between desktop and mobile.

Its messaging app is first class, with tons of features. I can type a message
on a PC, then carry my phone with me and pick up where I left off.

Its groups allow people to self-organize at a scale I don't see anywhere else.

Its newsfeed is what you make of it. If you follow your friends, you get what
your friends share. If you follow Ars Technica, Foreign Policy, IndieWire,
MIT, Open Culture, Priceonomics, Saveur, you get Ars Technica, Foreign Policy,
IndieWire, MIT, Open Culture, Priceonomics, Saveur. There's really no one to
blame but yourself and the people you follow if your newsfeed sucks or isn't
what you want it to be. Follow better people. Fake news, tribalism, and viral
videos as the new opiate/soma of the masses speak to human nature and what we
have become, not to the tool that lets us mindlessly consume and push away our
problems.

It's Gmail, Google News, Twitter, Instagram, YouTube, and Reddit all rolled
into one. For a lot of people who want a stable product that works well and
does those things, it gets the job done. I totally think Facebook wields its
power irresponsibly at times, but blaming it alone for weakening democracy is
like blaming roads for bad drivers.

~~~
EngineerBetter
Thanks for the critique of Facebook as a technical implementation. I was
curious as to what societal good the company and its products have done.

It sounds like "saved people a bit of time by doing all the things separate
services used to do".

~~~
manilamann
What about the millions and millions of small businesses across the globe that
were able to grow, reach customers beyond the borders of their cities, states,
or even countries, create jobs, create stability, and bring goods and services
to the people that want or need them?

What about the charitable-giving products Facebook builds, which have
collected billions of dollars for charities globally?

What about the first-of-its-kind Crisis Response tools that allow people
caught in natural disasters, terrorist attacks, or other horrible situations
to communicate with their family and friends across the globe and let them
know they're safe, for free?

What about the groups functionality that allows people to come together to
support one another through a cancer diagnosis, or the loss of a loved one
(grief counseling), or to coach and help one another land jobs?

I could go on for a while longer, but the point is that Facebook isn't the
only company or service that can do the things I've listed above, but it's the
only one that does all of them and doesn't charge the end user. Let's not kid
ourselves: building those products and the infrastructure they rest upon is
not free or easy, and it requires tremendous talent and resources.

Facebook is a business like any other and has to make money, which it does
through ads. So if you judge it purely on your experience/use case, which may
not include any of the products/features I mentioned above, it may seem like
it doesn't produce a net positive result.

But for those users that have used and continue to use Facebook in these ways,
the benefits are obvious.

------
warp_factor
I would love for FB employees to explain to me how they justify that their
product literally hooks fragile users on the internet, creating chronic
depression, FOMO, etc.

To me, it is no different than heroin given to a junkie.

You just need to travel anywhere in the world and look at all the tourists
spending their time Instagramming everything. On the other side of the screen
you have fragile teens who get depressed because their lives are not as good
as the fake paradise that can be seen on Facebook/Instagram.

This is the culture that Facebook and "attention-grabbing" social networks
have created. As an employee at Facebook, you directly contribute to that!
Yes, it is a symptom of something more deeply broken in society, but you act
like heroin to a junkie, facilitating the intake.

If we had to rank these things, I would probably put Facebook and other
predatory attention-grabbing networks (YouTube is another one) in my top 3 of
the worst things going on in the world right now.

~~~
Aghoree
Lots of things can be harmful or useful based on how you use them. You can
learn many useful things from the internet, while there are people looking for
child porn and recruiting terrorists. You can use a car to get places quicker
or to run people over. You can use nuclear power to generate clean electricity
or to make bombs.

A lot of people, including me, use Facebook services to stay in touch with
friends and family, most of whom live thousands of miles away, which I find
incredibly useful. People use FB groups to form communities around lesser-
known causes (like a rare disease or a dying language) and get in touch with
people they might never have found otherwise. At the same time, Facebook can
be used in harmful ways like you mentioned.

Sure, Facebook can perhaps regulate people's use of its services to protect
"fragile" people, but don't people deserve the freedom to do whatever they
want with their time? Should we also have laws about how long you're allowed
to use the internet every day, because it's addictive and can be harmful?

------
thisisit
Back in my hometown, a small village, we had a gangster. He made tons of money
through his illegal empire. But he also took care of everyone around him, even
those who were not involved with his criminal operations.

The press wondered: how could these people trust this gangster? What were
their thoughts/feelings about the smuggling etc. being reported daily?

The answer was: "This so-called gangster has built orphanages and donated to
religious causes. He has been a net positive to society. While this gangster
does some bad things, the press is harsher on him. The media knowingly writes
bad stories because they are jealous of his success or have a hidden agenda.
He just wants to do the right things and his heart is in the right place. But
he is human and makes mistakes. And before you label him as the worst, do look
at those other gangsters. They are just as bad, if not worse, than our hero."

This is a story which repeats itself over and over again.

------
ann3thr3w
Depends on the role: [https://www.theverge.com/2019/2/25/18229714/cognizant-facebo...](https://www.theverge.com/2019/2/25/18229714/cognizant-facebook-content-moderator-interviews-trauma-working-conditions-arizona)

[https://www.glassdoor.com/Reviews/Facebook-Reviews-E40772.ht...](https://www.glassdoor.com/Reviews/Facebook-Reviews-E40772.htm)

------
warp_factor
You want to know the other side of Silicon Valley?

Download the app called Blind, where people talk openly while remaining
anonymous.

What you see there is that everyone is trying to maximize their TC (total
comp). For most people, that's the only goal. The whole "let's make the world
a better place" thing is something that was sold by PR people.

To talk more specifically about Facebook, I have seen two attitudes in their
employees:

1) They use whataboutism when confronted: basically, they will show you that
Google and Apple are also evil, so it is OK for them to work at FB.

2) They openly say that they know it is a bad company, but it treats employees
very well and they are very well paid, so they are OK with it.

It makes me lose faith in engineering.

~~~
p1esk
There's also a third attitude where people don't think FB is bad. In fact I
bet a lot of people think it made/makes the world a better place.

------
fbthrowaway3
Going anon here, for obvious reasons.

People rationalize it two ways:

1. Whataboutism. Yeah, well, what about Google? What about Amazon?

2. Whatever, they pay me well and the perks are good.

My view is more nihilistic: Zuck is the same as he's always been. This is the
same dude who said "They trust me, dumb fucks." Don't get it twisted: Facebook
is about data collection and data sales. Everything else is a happy
smokescreen diverting user and regulatory attention away from that purpose.
Connecting people and all of that handwaving is a means to that end.

You trusted him/us; dumb fucks. That's basically it.

~~~
basch
At some point, there is more to care about than money. He has plenty. He has
more power than almost anyone in the world. Now he cares about legacy and
impact on the world: once he is gone, what will he be remembered for, what
change did he lead?

I don't think he is the same person, because he has it all now and could walk
away and never work another second in his life. He legitimately wants to make
the world a better place for himself to live in.

That said, "Never attribute to malice that which is adequately explained by
stupidity." Just because he wants to make change doesn't mean he goes about it
in the right way. And because of the way the media treats him, everything out
of his mouth is PR-approved doublespeak.

------
fbthrowaway212
A little background: I was an intern at FB just before and while the whole CA
thing unfolded, and I'm returning for another internship, hoping to join
full-time one day.

I had multiple offers from other Big-N companies and unicorn startups (some of
them paying more than FB). But here's why I _chose_ to still work at FB:

1) The values and goals of the company and the people at the company are
genuinely positive.

No PMs are going around wondering how we can profit off innocent people.
Goals for teams are oriented around "impact", e.g. how many small businesses
are effectively gaining customers due to new feature X, or whether feature Y
is actually making people happy or is good for them in the long term. An
example would be Facebook trying to create more "meaningful interactions" on
the platform while sacrificing time users spend on the platform looking at
click-bait.

2) The company and the leadership accept blame and work on fixing the
mistakes.

Unlike some other fruit company, Facebook has always (although sometimes not
as soon as it should have) accepted the blame for the repercussions of being
the disruptive platform it is. People in leadership personally take charge of
the issue and work to make the platform better. I know a lot of people believe
Zuck is pure evil from what they hear, but I don't think an apathetic CEO
would be willing to pull the platform out of countries whose laws don't fit
the company's morals and values. (Read -
[https://www.facebook.com/notes/mark-zuckerberg/a-privacy-
foc...](https://www.facebook.com/notes/mark-zuckerberg/a-privacy-focused-
vision-for-social-networking/10156700570096634/))

3) Every individual employee's voice is heard.

Anyone in any FB office around the world can personally ask the whole
leadership any question every Friday in the company-wide Q&A. And more often
than not, these questions are complicated. Misinformation on WhatsApp leading
to lynchings, what are we doing about it? Foreign governments intentionally
interfering in elections, did we know about it, and if yes, what's our plan?

No question is a bad or stupid question. And yes, we get better answers than
those PR statements. Knowing what happened in the gory technical details, and
what specific steps are being taken, convinces us that necessary actions are
being taken. I don't know of any instance where employees had to go on strike
and sign petitions to find out why the company was working on a project which
didn't align with the company's morals.

4) FB is doing something which has never been done before, at a scale which
barely any company has ever reached.

All of us know how much responsibility we have working on such an impactful
product. But that doesn’t mean we’ll stop trying new things. Those new things
might not be good in the long run, but there's only one way to find out. If
we think something is gonna affect people's lives positively, we're gonna try
it. Critique is expected and welcomed, even when it isn't accurate. This may
not morally align with everyone, but it's what I generally saw while working
there, and it's something I align myself with personally. Not being afraid to
use its influence to try to launch something new and disruptive is not
something everyone can agree with. And FB historically hasn't done much
research before launching products, but now it seems to be taking the
eventual consequences seriously. It still doesn't change the whole "hacking"
culture of just trying out new stuff.

5) FB does care about privacy. Shocking, right?

From the limited three-month view I have of the company, everyone is crazy
about privacy. There are continuing discussions on how to deal with
complicated situations, and teams dedicated to consulting other teams who
might not have the necessary overview of the possible effects. Encryption is
a hot topic, and research is being done on how bad actors can still be
detected and reported based solely on metadata, keeping the information
private. And there's no easy way to explain it, but the media lately have
been extremely biased against FB, reporting stories in a manner which
distorts the reality of the events. Every small mistake (deliberate or not)
gets exaggerated with click-baity headlines. I do believe that FB is
completely responsible for whatever happened with Cambridge Analytica, but
reading some of the articles detailing it just made me aware of the lack of
technical knowledge, and of intent to find the bigger picture, in most of the
mainstream media.

6) We see the positive impact happening real time.

No newspaper will publish on the front page about the FB communities which
allowed LGBTQ people to connect in a country where it's a crime. Or the
fundraisers saving kids who got separated from their parents. Or the
relationships which only came into existence because of the platform. Or the
small businesses flourishing that previously didn't have a chance to compete
with the big dogs. Don't get me wrong, bad things still do happen. There are
instances of bullying, misinformation, stalking, and a lot more. And people
at FB are genuinely trying to limit that. But in the big picture, most of us
do believe that we're having a positive impact on the world. And if we're
not, we're willing to change, regardless of the company's profitability.

TL;DR: Obviously I'm biased. I like that FB isn't afraid to make mistakes and
own them, I genuinely think FB is doing good in the world, and I believe it's
moving in the right direction.

