How scary it must be to work at a place with such an overwhelming "don't rock the boat" mentality. Leakers everywhere, and at Google, FB, and Apple especially, risk their jobs and their careers to give the public an open look at places that hold overwhelming power over our personal lives.
FB's internal perspective on privacy and its goals is vital for the public to know; it shouldn't take the next massive breach of trust to trigger an investigation to learn the details. A leaker, sorry, I mean someone "without integrity", in 2016 could have done a lot of good.
So I would ask you where you'd rather work? At an employer that trusts you not to leak stuff, or somewhere that doesn't trust you? If it's the latter, you might as well be a contractor.
You'll find this is true in most organizations, not just companies. They want to know if they can trust you with their secrets. It does require some faith that internal debate will help the organization make good decisions most of the time, which admittedly can be a stretch sometimes.
The post in full doesn't read at all like it was to stimulate discussion. It reads more like it was to silence dissent.
If you really wanted to stimulate discussion and gather employee views on this stuff, you'd send a survey round. But the relationship is still asymmetrical between boss and employee.
All you're doing when posting something like this semi-publicly is creating an environment where more quiet conscientious views get shouted down by the loudest voices.
Which makes his response read all the more hollow. Calling it a straw man? The post seems to have been an all-out justification for immoral behavior by an executive. I can't imagine a Jr. Engineer or someone fresh out of college with their MA in Stats feeling super comfortable hopping in and going, "Hey, this sounds unethical, and if people saw you saying it they would think we're hella fucked up." I'm relatively low-ranking at another big SV company and the thought of needing to stand up to a high-ranking employee like that is more than a little intimidating.
And not just that, but to be expected to "contribute to a discussion" in such a way that all your coworkers can see. I think as somebody who takes objection to that memo I'd probably be more inclined to look for alternative work.
That's good though. Boz already made the decision to prioritize connecting more people, despite the costs. He didn't have to tell his employees, but he did. This allows you to make the decision on whether it's worth staying at Facebook.
But I’d sure as hell take that situation over many others I’ve had where my only contact with execs is through occasional, content-free memos.
But yes, the people in power do tend to be heard more in internal debate. (Not necessarily just managers though. Good or controversial writers can also have a lot of influence.) And this does mean more soft-spoken people sometimes don't get an equal voice.
Online discussions are often more heat than light. I don't think internal discussion can be replaced by surveys, though? They're both useful.
There are also problems that equality doesn't fix. As the number of people scales up, the power of each person gets smaller. Filling out a survey when you know it's one of hundreds or thousands tends not to feel very empowering, or even a good use of your time.
They canned an employee for saying something unpopular they disagreed with.
The fact that this went such a different way says something. Maybe that something is "the cultures of Google and Facebook are so different it explains the discrepancy." But maybe it's "Facebook wanted to float this as an ethical trial balloon."
Basically this is the end of the tech world and all those people who used to join these firms because they believed tech would make the world a better place.
Now it's going to be pretty much closed communication and minimal interaction internally. If you have an issue, well, tough balls, tech is no longer good for that; god forbid it shows up on HN. If it shows up in the media, that's career suicide.
I suspect it's probably time for HN to be shut down soon as well.
If he actually believed that stuff, it would be extraordinarily scary. I don’t think he does.
Trolling is a terrible leadership technique.
Boz’s pieces over the last few years tend to fall into the “Strong opinions, weakly held” category. I also suspect he argues a point that is stronger in sentiment than he really believes, to help his message stand out.
He can’t come back and say he didn’t mean it. Besides, if he was trying to spark debate, doesn’t that mean he thought it was at least potentially justifiable to use unethical practices to drive growth?
In my daily casual conversations with co-workers and friends, the topic is very rarely raised, and only to speculate why the timing of the issue seems to be tied to a rise in conservative politics.
I'm not going to make any assertions about the intent of the memo because I don't know the context, but the logic expressed in the memo seems, to me from the outside, to have been their strategy already.
Yes, and there were discussions internally about this. One could argue that the recent shift from promoting pages' content to promoting your friends' content might have been the result of that.
Most of the justifications for it appearing as dissent-silencing will have to be post-hoc justifications as a result.
I would never be tempted to suspend disbelief that this one time my opinion, effort, goodwill actually mattered.
I've fallen for the "trust us" scam too many times. Embarrassing. Ever more, the only thing I trust is mutual distrust.
I like the idea of employees having open discussions about company policy and direction, but I never would have believed such a thing could exist at enormous companies like Facebook and Google. Though, given headlines recently, I'm not sure it will survive much longer.
This just reveals how big the power imbalance is between employees and executives that we’d have to make such decisions.
Speaking up internally when you’re not on the board often doesn’t get much done. At least if it’s a moral objection.
Depending what it is, speaking up externally might not get you more pull with internal decision-makers? Particularly if they feel betrayed.
Or, maybe a big external stink could cause action? Depends what it is.
It's quite possible that neither would work, and then you've burned your bridges for nothing.
That's a false dichotomy. I'd like to work for an employer that would not mind me discussing my work related stuff outside without immediately classifying that as a leak, and I'd like my employer to trust my judgment in knowing what is and what is not appropriate in such discussions.
I left Facebook for a place where the stakes are a lot lower, but information leaks like a sieve.
It’s really disheartening to know that a lot of new coworkers would prefer to leak their “spin” to the press and actively try to damage the company when they don’t get their way.
If it's good for the goose it is good for the gander and companies with this much influence on the world should welcome transparency, not oppose it. And if they do not welcome transparency then we'll have to help them along a bit every now and then.
I think the reality with FB is that the current idealized system is not necessarily the best, and likewise, the occurrence of leaks is not necessarily a sign of impending doom. Sometimes leaks are a necessary symptom for when an organization has gone off the rails and has failed to self-correct. I don’t think any organization enjoys or wants leaks — just as no human enjoys sneezing or diarrhea — but sometimes the temporary discomfort is necessary for long-term health.
It's ironic to see employees complaining about what is essentially a lack of privacy, when the company they work for goes out of its way to convince everybody that privacy is a thing of the past, and in so many words, so does the very Bosworth post they would protect and keep private. Eat your own dog food.
And then one of them says that whoever leaked the post (the whistleblower, is how I would refer to them) lacks integrity. Integrity? You work for Facebook. Has it never occurred to you that maybe you're the baddies?
A corporate code of silence that insists on the absolute privacy of internal communications is similar to the Mafia's code of silence, in which it's considered bad form (punishable by death) to blab to the authorities:
"Omertà is a code of honor that places importance on silence, non-cooperation with authorities, and non-interference in the illegal actions of others."
It will be interesting to see what kind of documents this and other whistleblowers will decide to leak in the future. Facebook needs its own Snowden to expose its inner workings.
 I suspect that the corporate obsession with secrecy we're seeing here is not unique to Facebook. What's unique is the irony of a privacy-destroying company insisting on its own right to privacy.
Employers which still use trust at scale are ignoring their risk analysts. The risk of a secret leaking is proportional to the number of people who know the secret. You can reduce the risk with Stasi-style surveillance, or legal enforcement (e.g. legally classified state secrets), but few people wish to work under those conditions.
It's a false dichotomy because people would rather work for an employer that trusts them with the secrets they need to get their job done, and doesn't trust them with the secrets they don't need, a.k.a. the principle of least access. Openness in organizations is important insofar as people can attain access to information they need when they need it, but not unlimited access to everything, which ultimately reduces organizational trust when leaks inevitably occur.
I'm a contractor who's been on the same team of FTE devs for a year. It's going really well, but I was a little hurt when I realised they were reading CVs to fill a vacancy on the team without putting me in the loop.
Edit: I understand why they're doing it (I mean, I'm from a big consulting company, Alten), but it still stung a little, especially since I'm on pretty good terms with the rest of the team.
Personally, I’d rather be in the dark about policy than be schnookered into thinking that I have some meaningful input.
It seems apt that internal debates and posts would leak to the outside world.
Boz fostered this culture by example, publishing internal memos critical of the company. In this case, questioning whether the company's driving mission was the universal good that leadership thought it was when it was adopted. Having such high-profile dissent in circulation gives more cover to individual contributors with a gripe than any amount of policy language would.
Except that, Bosworth's shabby recantations aside, in the original post the driving mission was not questioned but reinforced to an extreme, cult-like degree by a high-ranking Facebook official. He didn't mince words and sought no compromise: growth at any cost, using unethical methods and to the point of endangering people, if that is what it takes. Growth is a good by definition, regardless of what your antiquated, pre-Facebook morals tell you.
This is a wide extension of the field where debate is possible and a strong reinforcement of unethical behaviour: "Facebook and Boz have your back, and anyone questioning growth is an enemy." What was previously unspeakable is now under debate; we are debating the degree of acceptable unethical behavior, and Boz's position seems to be "to any degree". This was merely 20 months ago, not in the distant past when Facebook was founded.
It's specious to call this an environment of open debate; it's a bold move toward the organizational culture of a cult or criminal gang. It's not surprising at all, then, that the current debate centers on ways to root out the traitors and select employees for "integrity" (unflinching loyalty).
1) This is the state of things today, and the uncomfortable truth of how we got here; what do we do about it?
2) This is how things both are and should be; either get in line or leave.
I, obviously, gravitated towards the first interpretation and you the second. Without further context, I'm not sure there's any way to really know which was intended.
I know this is hard to believe from the outside, but most Facebook employees believe that Facebook can have a positive influence on the world. It’s deeply ingrained in the company’s culture. An executive doesn’t just come out and say “it’s all business, fuck the consequences”.
This is why not leaking things is so important. The context and culture within a company change how a message is interpreted.
It applies to pretty much any tech company that claims to love debate and dissent internally.
Today somewhere around 1/4 of the world's population is Islamic. And there have been a total of 3 Islamic Nobel laureates in the sciences. It's a rather nice demonstration on the question of whether 'geniuses' are born or made. If Allah's hand is not chained, what point is there in seeking to discover these alleged laws of nature?
 - https://en.wikipedia.org/wiki/Al-Ghazali
So, obviously, since the graph is spiking, we can expect to see lots more.
This is my sarcastic way of saying that taking a single metric which is affected by tons of different factors, and applying it to a complex argument about, basically, sociology/anthropology (human behavior and culture), really doesn’t provide a lot of value.
I think your post opens a door to a lot of interesting conversations, but that using the # of Nobel Prize nominations per religious / cultural group as a metric closed most of those doors.
It’s also not very scientific.
The Nobel Prize is a European institution.
But I think that, in such a context, the acceptance of fundamental truths is necessary to have a debate, like mathematical axioms are necessary for proofs. In addition, those fundamental truths are about one's Faith, so I don't think there's a lot to debate on: either you believe or you don't.
It’s probably worth bearing in mind that in any company that pushes this kind of ‘open’ communication, there’s an unavoidable pressure on most ordinary employees to say the ‘right’ things. A company that has so much of its internal correspondence open and visible to anyone will very quickly descend into 1984 territory. So those ‘dense’ folk bleating about integrity are likely to really be saying ‘I would never leak, boss, you can trust me’. And in all likelihood, the actual leaker is one of those voices. Personally, I find it baffling that so many supposedly intelligent people see an office under the Eye of Sauron as a Good Place to Work.
I read his memo titled "The Ugly." There was nothing in there that was critical of "the company." The only criticism I read was this individual being critical of people who might be prone to self-reflection. Judging from the memo and other employees' characterizations of him, "Boz" just sounds like a total asshole.
> That’s why all the work we do in growth is justified. All the questionable contact importing practices. All the subtle language that helps people stay searchable by friends.
> I know a lot of people don’t want to hear this. Most of us have the luxury of working in the warm glow of building products consumers love. But make no mistake, growth tactics are how we got here. If you joined the company because it is doing great work, that’s why we get to do that great work.
I understand what you and others are trying to say. That he somehow disagreed with his own words, and wrote it to start an internal debate.
But is the reader supposed to know that from some kind of context we can't see? From the words themselves, it seems pretty clear he fully supports the continuation of, in his own words, "questionable" practices.
Not only that, but he told his employees what they are doing is totally justified and to keep doing it, because it was sanctioned by management.
The most common reaction for people is to ignore the moral implications, a la wall street "We are unlocking value".
If Boz did a 180 and said "welp, that's it, growth is over, we have a major disaster in a few years", the GREATER force would murder him, namely shareholders.
Even now, Facebook's greatest pain is coming from the hit to its wealth, not from the number of uninstalls coming via "deletefacebook".
At this scale and size for a large top tier tech company, the man in charge is expected to not rock the boat. Any course correction occurs slowly, or through crisis.
Apparently we are doing it by crisis
It's great they feel they created a microcosm of openness within the company, but that doesn't seem to have made it act any more morally when it comes to protecting user privacy.
“We connect people. Period. That’s why all the work we do in growth is justified. All the questionable contact importing practices. All the subtle language that helps people stay searchable by friends. All of the work we do to bring more communication in. The work we will likely have to do in China some day. All of it”
This is not dissent, this is not 'rocking the boat', this is not being critical of the company. He's taking their mission to the extreme worst case scenario and saying that even then the mission is justified. It's the polar opposite of the things you assert it is.
FB is covering by saying no one agreed with it and it was only there to provoke discussion. Why delete the post and its discussion then? It's obvious the discussion on the post wasn't critical enough to actually provide cover for these excuses so they burnt the post and are now lying about it.
Because the post and the discussion was leaked publicly.
Unless you were there since 2005, Boz was a higher up with more seniority, so the original memo was more of a put up or shut up piece than an RFC.
That couldn't be further from the truth. Facebook's internal communications happen almost exclusively through Facebook itself, meaning this "memo" was most likely a Facebook post, complete with liking, reacting, and commenting capability from anyone in the company.
Buzzfeed touched on this in their version of the story:
> One former employee who spoke with BuzzFeed News noted that they remembered the post and the blowback it received from some workers at the time. “It was one of [Bosworth’s] least popular and most controversial posts,” the ex-employee said. “There are people that are probably still not in his fan club because of his view.”
It likely depends on where you are in the company. I haven't worked at FB but have a few friends who have in the past. One of them told me something pretty similar to what you experienced, the other had the exact opposite.
So I'm honestly not sure what to believe about their culture. It seems that once a company gets big enough, its culture becomes multiple subcultures, and I'm not sure there is a single culture that drives the company anymore.
For example, how would you feel if you told your spouse about the grudges you have with your friends or work colleagues, then (s)he goes out and tells on you? That wouldn't be a very happy marriage imo.
In society people that spread gossip are marginalized by those that hate gossip, because we have private affairs that we'd like to keep private. It's a natural phenomenon.
And yes, I see the irony of defending Facebook by invoking privacy. I try not to have double standards.
I have a hard time having empathy for Facebook in this situation when their entire approach to users' information has been incredibly disrespectful. Constant TOS changes. Misleading privacy settings. Opt-out rather than opt-in sharing. Dark patterns designed to serve the company rather than the user ... and straight up bad ideas. I deleted my account right after they did the TOS change that Cambridge Analytica took advantage of. (The one where your friends' choices would share your information. That was a transparently dumb idea from the get go.)
Facebook surrendered its right to discuss such things privately when it willfully kept lax policies on sharing users' data. Stuff like this should leak earlier so we can talk about it before, rather than after, awful things happen.
You inspire this sort of debate with a thought exercise and by asking about actual application; you couch the conversation to direct your staff toward stronger ethics.
If this conversation were at Uber, in their self-driving car division, the consequences would be measured in human lives. The way to have that conversation with context would be to couch it in the "trolley problem", because that would keep the framing.
Ethics, the word is ethics, and Facebook is clearly lacking them. We're "dumb fucks" according to FB's chief, and the fish rots from the head down.
And the staff's response "find the leakers" -- funny how many groups of people I find despicable seem to chant this.
It was a case of starting a debate by voicing an extreme opinion.
The memo is exactly and clearly telling employees what to think and do. No questions allowed.
It's an "emperor's new clothes" situation where we all get to play the role of the child.
I think you could write about the ideas contained in Boz's memo in such a way that if the memo leaked you still wouldn't look like huge bleeps.
It's not the conversations that are getting them into trouble. If it were just this memo, then nobody would care.
Their action are getting them into trouble. Leaking memos like this merely offers a window into their souls.
I find these surreal cult-like conversations a lot more off-putting than Facebook's data practices. Those I can understand; these conversations (and the words of these well-meaning employees more than those of big bad Boz) make me feel like I need to take a shower. To me, this shows the very worst of modern-day North American culture, intellectually dishonest to the point of delusion, and it disgusts me.
Perhaps it could have been worded differently (better?), but I did appreciate the solid direction that was given by the memo. All too often, leadership is unable to give clear guidance because they are too wishy-washy about what the goals actually are, perhaps not even knowing what the goals should be besides making money.
It showed that FB says one thing in public and then does the opposite in private.
>Bosworth distanced himself from the memo, saying in a Twitter post that he hadn’t agreed with those words even when he wrote them.
That is scary as shit. That they think the leaker is the one without integrity and not the executive team.
What terrible people.
The language was kind of funny to me even. Hunting down the leakers to make Facebook great again... the company sounds like the business version of the white house administration. If it's this difficult and requires this much secrecy to convince yourself that what you're doing isn't evil then maybe something is very wrong on a foundational level.
I'm struggling to see why anybody thinks this is a reasonable defence.
There is no indication he didn't believe it. The company's behaviour is consistent with it. He only said "I didn't mean it, it was to stimulate debate", and the classic "You're missing context" (which I am not able to show, of course) after drawing negative PR.
It seems very generous to me to give his recent tweet much credibility at all.
Says the company whose very moral compass is coming into question.
Nazi soldiers following orders to line up minorities in slums and shoot them, or herd them into cattle cars - loyal, yes? Moral? No.
"Loyalty," in that post, to what? To Mark and the investors' bottom line, and tangentially with it the bottom line of you and your fellow employees? Loyalty to this idea of "connecting the world?" Is that really the value of Facebook?
It sounds like a cargo cult.
Don't forget - corporations are people too. I think Facebook's mission might be to connect people with advertisers.
Debate is ok, but anything real that threatens the power base will be quickly dispatched. The only things that can change Facebook are Zuckerberg and his inner coterie, or strict regulation. But this is not a problem limited to Facebook. Google is worse, and there are others like Palantir and a pipeline of companies who would like to take their place.
Ethical behavior from individuals will only have an effect in smaller companies and early startups. When they can't get engineers who will agree to unethical practices, and when there is pushback, they quickly realise they may need to rethink.
But software folks have postured on freedom and liberty endlessly while going ahead and building some of the creepiest stalking infrastructure ever made, without a care for fundamental human values or ethics, and thus are not trusted anymore.
People with integrity have done this all the time, people with integrity are standing up and bringing these issues up constantly.
But frankly, normal people can't be arsed to give up free services like Google and Facebook because they don't, can't, and won't pay the full cost of those services.
Further, I am a dyed-in-the-wool non-Facebook user who was warning about this from the day it was created.
But Facebook employees are correct in what they say.
They genuinely believe that they must be a force for good, and that their websites will bring people together.
It is the MOST essential thing for these people to be able to talk to each other candidly and clearly while they still believe in doing good and being ethical.
Because once that goes, the ability to say uncomfortable things, the only other option is to become the corporate behemoths that all SV-ites hate. To become a suit.
I can't understand how people on HN are missing this.
Facebook has been regularly an enabler - but for all those years HN has been cool with it.
Now, when the shoe has dropped, people here are displaying the same overreach and lack of nuance that created this scenario in the first place.
Facebook is the least of all evils. People are ALWAYS going to create this miserable form of social networking because it's easier and matches human neuro patterns closely enough.
But this is the one time we will have a single institution which is not yet culturally made up of suits, who can institute or make the effort to fail correctly.
Facebook internally discussing this and realizing that there is no hope is more critical than people tearing facebook down.
Having a clear idea of objective reality, of being able to see our actual options as both employees at facebook, and as users of facebook (or friends with facebookers), is our best way forward.
I know that some team leaders at one UK telco I worked at were asked to go through DV clearance; that's TS (drug tests, polygraph) in US terms.
It is disconcerting how FB employees have come out in support of these ideas from the memo, though.
For a view on what's different, an excerpt from a Guardian piece is below.
“It’s horrifying how much they know,” he told the Guardian, on the condition of anonymity. “You go into Facebook and it has this warm, fuzzy feeling of ‘we’re changing the world’ and ‘we care about things’. But you get on their bad side and all of a sudden you are face to face with [Facebook CEO] Mark Zuckerberg’s secret police.”
The public image of Silicon Valley’s tech giants is all colourful bicycles, ping-pong tables, beanbags and free food, but behind the cartoonish facade is a ruthless code of secrecy. They rely on a combination of Kool-Aid, digital and physical surveillance, legal threats and restricted stock units to prevent and detect intellectual property theft and other criminal activity. However, those same tools are also used to catch employees and contractors who talk publicly, even if it’s about their working conditions, misconduct or cultural challenges within the company.
BUT, I've read the memo, and as I saw it, he says that getting people connected can mean that bad guys will also be connected. But that is life. Terrorists and child molesters will use a smartphone too...but
"This is so disappointing, wonder if there is a way to hire for integrity. We are probably focusing on the intelligence part and getting smart people here who lack a moral compass and loyalty."
Note how the focus is not on the morality of honouring users' privacy, but on the "morality" of protecting the company. Collecting personal information about billions of people, using "questionable" practices, then selling access to it. It's a "good job". One that the commenter does not want to lose. Understood.
However there are laws to protect companies against employees who leak secrets. Employees sign nondisclosure agreements. Companies can adopt no tolerance policies on leaking to the media. They can terminate employees who violate them. No employee needs to consult a moral compass; the rules are clear. Break them and there can be grave consequences.
On the contrary, there are no equivalent remedies available to users whose privacy has been entrusted to Facebook. There is nothing to keep FB honest. There are no grave consequences for violations of user privacy.
When there is a "leak" of users' information, the user is entitled to nothing more than an impersonal apology.
Relative to other businesses, one might go so far as to believe "there are no rules" in the space where FB has operated. Users (who are not the customers of FB) have no recourse; there's nowhere else they can go. Buy.
In all seriousness, it is the user who must hope that every FB employee has a "moral compass". Whether FB employees can trust each other is not what the user wants to know. The user wants to know if she can trust FB's employees.
Many jobs considered "good" require a person to forget about ethics.
Tobacco, oil, pharma, agro, car, ads industries. Now we can also add "social web" to this list.
A person willing to suspend morals in exchange for money is the one you should not trust. Especially if they cannot be held accountable for their actions.
Because hell knows what ELSE they are ready to do.
And they will fight to defend their source of income = "loyalty" = "protecting questionable practices" = "indulging in mafia-like behaviors".
You can't simultaneously hold the opinion that there's no harm in sharing information while also holding the opinion that there is such a thing as a "leak." It just strikes me as so ironic for a company that champions "privacy is dead, live with it" to have to delete its own valid internal debates because of the consequences of a lack of privacy (i.e. a leak = lack of privacy).
Privacy is looking pretty alive in that neighborhood.
"This must result in minimization of efficient internal communications mechanisms (an increase in cognitive
"secrecy tax") and consequent system-wide cognitive decline resulting in decreased ability to hold onto power as the environment demands adaption."
"Hence in a world where leaking is easy, secretive or unjust systems are hit nonlinearly relative to open, just systems. Since unjust systems, by their nature induce opponents, and in many places barely have the upper hand, mass leaking leaves them exquisitely vulnerable to those who seek to replace them with more open forms of governance."
-- Julian Assange
It's also the justification for keeping courts open to public attendance but barring recording equipment (in Canada).
Likewise, you can't simultaneously hold the opinion that users should have control over where their content is seen, and that it's OK to publish and comment on an internal post. In a less spiteful world, some of the employees' reactions might have been taken as evidence that they do understand and care about issues of privacy or containment. Maybe that would lead to more collaboration on solutions, which is necessary because there are actually some tricky tradeoffs here. But that doesn't give the same dopamine hit as cutting down the tall poppies, right?
My point is that Facebook's ethos, that a post-privacy era can exist and be okay, is betrayed by how they clamor when it's their own privacy. It makes it feel like it's a one-directional relationship.
Perhaps you're suggesting facebook's ethos is actually that a privacy middle-ground can exist, where people can choose what gets shown where and to whom. I can't disprove that that's their ideal, but in light of their data-sharing I have no reason to believe it.
Aside from ideals, we can point out the consequences of Facebook in practice. Facebook is a written medium that preserves everything (even something from 2 years ago) and thus has constructed a system that forces its users to hyper-curate their entire public persona or suffer social consequences, and from a practical perspective their own VP failed to curate sufficiently. So regardless of ideals, if the system punishes discussion then I see that as a problem, as well as an irony when it happens to their own VP.
They can, to a larger degree than most critics seem to realize. I use those controls all the time. I might agree that they're not as prominent or easy to use as they should be, but they exist because people at Facebook cared enough to implement them (which isn't easy or cheap at that scale BTW).
> I can't disprove that that's their ideal, but in light of their data-sharing I have no reason to believe it.
Oh, you mean the data sharing that was curtailed sharply in 2014, and again in increments ever since? Is that continuing effort and foregone revenue "no reason" to believe such a sentiment exists?
> if the system punishes discussion then I see that as a problem
Yes, punishing discussion is a problem. Wait, what is this entire thread, and a thousand like it all spawned by the publication of these posts, about?
Don't pretend they're taking care of it on their own. They're reading text messages in 2018.
>> Yes, punishing discussion is a problem. Wait, what is this entire thread, and a thousand like it all spawned by the publication of these posts, about?
Wait... Are you arguing that it's not bad that facebook stifles controversial opinions on its platform because its behavior creates controversial discussions on other websites...?
There's a power asymmetry here that makes the individual user vulnerable, and that power asymmetry should be countered by demanding transparency.
There's an obvious parallel here between individual citizens and government apparatus.
Those that control infrastructure and institutions shouldn't be enabled to abuse that power. And if they do they shouldn't be surprised when the affected protest!
This argument is illogical, because Facebook forces everyone to sign its ToS to use its services, while nobody forces a Facebook employee to leak internal stuff. Said another way, whether or not I wish to have control over my FB data, FB coerces me to agree that it can do whatever it wants with my data. It's not exactly opt-in, is it? It's far worse, of course, if you consider shadow profiles, because it is coercing even people who never explicitly signed up to the ToS. Unless the leak happened via some kind of coercion (which doesn't seem to be the case), your comment is incorrect.
>> In a less spiteful world, some of the employees' reactions might have been taken as evidence that they do understand and care about issues of privacy or containment.
What? You mean you care about something, but you just won't do anything about it, nor openly tell anyone why not, nor even talk about it before the issue blows up? Yep, totally convincing.
>> Maybe that would lead to more collaboration on solutions,
Why do people need to "collaborate" on solutions? What do they get from it? Is Facebook going to pay people a share of the profits? If Facebook is a corporate entity which serves its self-interest against people's self-interest (which they have clearly been doing for a long time), what kind of idiot would suggest the people whose self-interest has been affected should now "come to the table" so "we can all work something out"?
>> which is necessary because there are actually some tricky tradeoffs here.
The only tricky tradeoff here is: should Mark Zuckerberg be the only one who should go to jail, or should the entire company be rounded up? It is quite tricky, I do agree.
>> But that doesn't give the same dopamine hit as cutting down the tall poppies, right?
I don't know about tall poppies, but "culling" the "weeds" is the only way to have a healthy garden.
You're free not to use it. If that opt-in isn't enough, exactly how many levels do you want? If you do choose to use a free service, whether it's Facebook or a public library, you have to consider how it's paid for. Actively using something and also actively undermining its means of support ... well, I'll just leave that thought there.
> You mean you care about something, but you just won't do something about it
You seem to have some pretty unrealistic expectations of what individual employees can do at a 30K-person company, or about anyone taking the right action without deliberating first.
> Why do people need to "collaborate" on solutions? What do they get from it?
Ummm ... the solutions, which are not only applicable to Facebook? This is a general problem faced by many companies. The solutions could also be useful to the people who blather about creating a distributed alternative to Facebook. I've been a member of the decentralization and distributed-system community for far longer than Facebook or Y Combinator have existed. I also know something about the scale and connectedness of the data at Facebook. We're multiple basic innovations away from being able to create such an alternative. Wouldn't it be nice if people who actually understand various parts of this can talk and work together? That doesn't become more likely when every discussion is filled with people who only read others' comments enough to find where to insert their own half-baked opinions or insults.
Since you can't seem to count to 2, how about:
1. You let us share your data with others in return for free service
2. You don't let us share your data in return for paid service
>> If you do choose to use a free service, whether it's Facebook or a public library
Well, a public library is tax funded and people outside the library employees have a big say in its inner workings. So you can't get your comparisons correct either.
>> Actively using something and also actively undermining its means of support ... well, I'll just leave that thought there.
Perhaps you should complete the thought, because I don't actively use the thing in question.
>> You seem to have some pretty unrealistic expectations of what individual employees can do at a 30K-person company, or about anyone taking the right action without deliberating first.
Really, as opposed to your very realistic expectations that everyone should just trust FB employees would have "done the right thing" had they not been caught red-handed? Oh right, because FB knows better what is best for everyone else.
>>Wouldn't it be nice if people who actually understand various parts of this can talk and work together?
This is truly bizarre. So if FB rolls over and dies tomorrow, does it mean innovation will come to a complete halt? Let us say you think, "oh, but it might take much longer". Does that automatically adversely affect people more than the damages that can be caused to society via rampant data collection? How can you be so sure? Oh wait, because you must be smarter than everyone else, as you got through the interview.
And finally, it is interesting all the things that you selectively left unsaid (exactly like other FB employees have been doing all the while).
- you don't have the courage (what an ironic handle) to discuss shadow profiles
- you never actually addressed the fact that no one from outside coerced the leak, which made your first comment more rhetorical than substantial
- you cleverly twisted the "collaboration" to be amongst FB employees, when the line that follows clearly shows you actually meant collaboration between FB employees and its users (a dopamine hit for whom, then? so you are now assuming others cannot read either?)
Personally, I think that might be a good option, but you can't claim to have made it explicit before so your "count to 2" insult is misplaced. I know that the only thing you've ever done since your account was created is bash Facebook (how nice that anyone can check that for themselves BTW), but even in that light such childishness is counterproductive.
> if FB rolls over and dies tomorrow, does it mean innovation will come to a complete halt
Total strawman. Nobody said or implied that. There's plenty of knowledge and innovation everywhere, but the amount that can come from Facebook only has to be non-zero to support my point. Several hundred developers who have collectively worked on almost every distributed system you've ever heard of might have an idea or two worth discussing. They might even have a perspective on scaling issues that's highly relevant to the problem at hand but not widely known outside of Facebook and maybe three other companies. Why do you try so hard to throw cold water on any such conversations?
I believe your comment would have been better without the name-calling and the leap to jail time.
The sheer lack of self-awareness in that statement.
As far as I'm aware, Facebook has never sold personal information to advertisers, because that would be giving away its crown jewels. It might leak it, but doesn't hand it out for sale.
EDIT: Downvoters, please point out where I'm wrong. It looks bad for the community to suppress factual statements.
- They use 'personal' as 'private', or 'information about you'. e.g. I have a fetish, I wish no one to know, it's 'personal'.
- You seem to use 'personal' as 'personally identifiable information'. e.g. if I have this data, I can now trace it back to its originator.
The ambiguity can be found in many places. Most people don't make the distinction; rather, they think anything private is personal. At the same time, the definition you use can be found in official documents like the GDPR, the new EU privacy law (https://gdpr-info.eu/art-4-gdpr/).
Facebook allows clients to purchase personal information. What specific personal information is purchased by Facebook advertisers?
Also, nothing changes the fact that Facebook is clearly harmful to society in important ways, according to various studies. And they do not care about this fundamental issue; they just want more users and more profit.
The CA scandal did not involve any sale between CA and Facebook. CA sold insights gleaned from data pulled from Facebook's public and free-of-charge Graph API.
> Also, nothing changes the fact that Facebook is clearly harmful to society in important ways, according to various studies.
I'm not sure how that's relevant here. Arguing that Facebook is a bad company does not make misinformation more correct.
Disclaimer: I worked at FB.
I'm speculating here, but it may even be possible that they have a greater global reach than the NSA, as 2.2 billion people are in their system, using their apps on everything from watches to desktops. They record all the personal data, track every website visited, possibly hot-mic phones, and record people's locations. Remember those Russian soldiers posting photos from inside Ukraine? Yeah, Facebook has that data, and intelligence agencies of the Cold War era would have drooled over it.
This is easy to fix. Follow the money and change the incentives. And the fix will allow software folks to continue to posture endlessly as they will not have the power to make a decision.
1. Ban micro targeting by advertisers, only contextual text and location can be used.
2. Platforms like Facebook, Google and their chain of contractors, partners etc cannot offer micro targeting.
3. Ban data aggregation with heavy penalties and imprisonment for companies, marketing individuals and their agencies who try this. This will solve these problems in a jiffy.
That removes the incentive for stalking people 24/7, collecting tons of data and making correlation and inferences.
The thing is, so many people, especially governments and political players, crave this power that it can't be taken away anymore. That's why building surveillance infrastructure is a bad idea: it will always be abused, which is why the individual ethics of the engineers and of society at large become critical. But we have already failed that test.
I mean, I consider myself a loyal employee, but I'm not blind to ethical violations. The way these employees are defending a global multi-billion-dollar organization, it's almost like they were executives. They'd rather sell out the rest of the world for what? To be a FB engineer until they retire? It's like Facebook indoctrinates its employees somehow.
I can't think of a single non-managerial employee at my company that wouldn't speak up if we deliberately started violating agreements with our customers and coming closer and closer to breaking the law, and I'm comfortable with that.
They knew FB wasn't the most ethical company around, and that they at times would have to pinch their nose.
In return they would be handsomely financially rewarded.
What we are seeing now is not the response of cult members - it's the response of lots of pragmatic employees/"investors" being protective and worried about their investment. In the worst case they would have irrevocably stained their personal reputations for a relatively small gain.
So to repeat, no - this is not a cult - the employees knew what they were getting themselves into. We should not feel sorry for them.
I didn't go to FB thinking "it's unethical, but money." I went there because I wanted to work on a product that I and every one of my friends/family use, one that helped me a lot when I was getting divorced, etc. The money was OK, but you can make more than at FB, e.g. in finance, or other special places.
I've never worked at a company that takes data protection as seriously as FB, or has the caliber of people protecting data.
Also impressive was that _every_ week there was a 1-hour Q&A where any intern/employee could ask Zuck a question (it's open mic). In my time there I never saw Zuck really duck a question. He stands behind what his company does and believes in the mission, as do most employees.
"it's the response of lots of pragmatic employees/"investors" being protective and worried about their investment" > sorry, this is just silly...
I don't doubt this, but I think Facebook has a flawed culture that allows and encourages employees to use this mindset as a rationalization for unethical things like:
- run emotional experiments on news feeds
- silently logging all Android users' calls and texts
- allow proliferation of fake news
- allow buying of propaganda ads from state actors
- zero safeguarding of data to ensure it wasn't sold by app creators/devs.
Data is central to Facebook's business model, and the ability to collect, analyze, and sell lots of data (a natural result of the 'big data' hype) became an infatuation for Facebook employees.
The Boz memo supports my point - except he cleverly hides it as 'grow at all costs' rather than the underlying 'collect/analyze/sell data at all costs'.
"silently logging all Android users' calls and texts" > I also don't like this.
"allow proliferation of fake news" > I think the "allow" part is disingenuous. It's not like FB people are able to guess all the bad vectors in advance and have advance alerts set up. Also, remember, 2B people are on FB, so there will be a lot of shit, because that's what people are like. I actually think they reacted pretty quickly, after the first time there was a credible attack.
"allow buying of propaganda ads from state actors" > Not sure what you mean? US elections are okay to use ads, right? You're saying other countries shouldn't, if you don't like the gov't? This is a lose-lose on FB I think, either people like you bitch that they're enabling a bad gov't, or they're seen as a censor. Believe me that a lot of smart ppl are trying to figure out what the "least wrong" thing to do is on things like this.
"zero safeguarding of data to ensure it wasn't sold by app creators/devs" > bullshit, they stopped this is 2015. But I agree, the way it worked before was really broken and asked for this to happen.
> bullshit, they stopped this in 2015. But I agree, the way it worked before was really broken and asked for this to happen.
> I've never worked at a company that takes data protection as seriously as FB
Before or after 2015? Because a couple of years does not quite make up for the preceding period starting in 2007 (if I recall correctly) during which FB clearly didn't care.
I'm not claiming FB couldn't have / can't do a better job, you can always do a better job, hire even more people for this, etc. But it was definitely taken very seriously, much more seriously than you'd think from all this bad press. And if you go and work there, you'll be impressed, I guarantee that.
However, what I'm talking about is data protection, the problem here was that app permissions were explicitly too loose [until 2015]. As I said, I also think this was a bad policy, and people are rightly upset. But there's way too much generalization happening in this thread.
Just this week he has ducked questioning by the UK parliament, and opted not to stand behind what his company does. A couple of weeks ago, when this story broke, he opted to just hide for a little bit, issue legal threats against the guardian and the NYT and see if the whole thing would blow over.
> Maybe it costs a life by exposing someone to bullies. Maybe someone dies in a terrorist attack coordinated on our tools. And we still connect people
This guy is a VP at Facebook. Words mean things and his words have weight within the company. This alone disgusts me. He could have easily taken the other side of the argument to stir debate and chose not to.
> Leakers, please resign instead of sabotaging the company
I think the level of hubris espoused by these Facebook employees is a much better reason to delete Facebook than anything I've seen so far. In fact, although the data we have is incomplete, this may well be the general feeling. The focus on growth and profit over any thought of doing the right thing is actually evil, especially when one recognizes that evil is being done.
This company is no longer a small company built out of a dorm room. It is a massive publicly traded company that has revenues and active users in the billions. Despite the current climate, words actually have meaning, especially words greatly amplified on these tools, and these actions have real consequences.
It helps the general public by giving insight into how crazy the people who build and moderate a platform used by millions to communicate really are.
Do you think Snowden's leaks were intended to help the NSA?
That blog post was basically Boz publicly acknowledging a personal flaw he hadn't been aware of up until his interaction with Dustin. In other words, calling him a mean co-worker in 2008 would be an accurate characterization, if we rely on his recollection of events when Dustin was still Facebook CTO. Calling him a mean co-worker now would be an unfair characterization.
FWIW, Jeff Bezos told a similar story, in a 2010 commencement address, of how he became aware of his meanness to his grandmother, though in his own case he was a 10-year-old boy when he came to that realization.
It brings to mind the famous Upton Sinclair quote: "It is difficult to get a man to understand something, when his salary depends upon his not understanding it!"
At best, we only have evidence that the subset of employees who chose to comment on that internal thread and whose comment was chosen by the journalist to be included in the article feel this way.
One of the things that media and aggregation have really accelerated is the cognitive fallacy where we assume the most interesting data points are the most frequent when in reality the inverse is almost always true — common is boring. If you were to go by the news, men bite dogs way more often than dogs bite men. But that's only because "man bites dog" is worth reporting and "dog bites man" is not.
It's a disconcerting system.
Evidence to the contrary? What is your evidence that the value of Facebook is not outweighed by its costs?
But IMO, this was just an employee who admires Snowden.
Where by "jerks" you mean "people with deep misgivings and the courage to risk their careers by speaking out against one of the world's premier surveillance organizations?" This isn't the kind of thing you do for lulz, and I doubt they're getting paid.
Yes, this cultish behavior is real, but I'm sure there are just as many jaded employees or else this leak would never have happened.
Boz complains that his memo was taken out of context and that he doesn't even agree with it anymore, yet everyone is judging! Gasp! Facebook, on the other hand, totally connects people by creating and selling ad profiles on said connected people, based on data they shared years ago, out of context, regardless of whether they still agree with what they said back then! And that's a good thing, right? Because it connects people! Ugh.
Based on Facebook's latest reactions to media coverage and memos shared to public I was able to deduce an ad profile which I'd like to sell. Facebook, it appears, might start looking for external psychological and legal counselling, and I might have third parties interested in targeting that circumstance.
"Boz" is the only person I regret introducing to programming. (I convinced him to attend the 4-H project where I taught BASIC)
There are many, including those inside Facebook, that are actively asking for more federal regulations.
I don't do regulation discussions. Brings out too much stupid.
But I will say that this is a survival ploy. If you get regulated, you are now approved by the government. You're good to go, able to operate freely in society. You just have to abide by whatever the regulations are.
Facebook may have a business model that just doesn't work in a society that values privacy, anonymity, and small diverse groups of people with widely-varying mores. This seems to be the lesson Facebook itself is learning now about, well, Facebook. What do we do then? It's not like anybody at Facebook is going to bring that up. They've got a ton of money. What do we do if the existence of Facebook itself is unacceptable?
Not necessarily. I suppose it would be possible to regulate it into nonexistence, by undermining the privacy-invading business model. You could probably get at least part-way there by requiring that profile-based targeted advertising and data collection be explicitly opt-in with informed consent. IIRC, the GDPR has a lot of good stuff about how consent must be obtained. Further regulations could mandate that services and incentives cannot be provided conditionally based on tracking or accepting profile-based targeted advertising, etc.
If the above regulation takes effect, much of the business value of profile data evaporates. Facebook would only be left with eyeballs to shove non-targeted ads in front of, and maybe generalized market research.
If you justify "questionable contact importing practices" then you aren't putting up a straw man. You are talking honestly about your "questionable" decisions and trying to defend them. Thus you should expect outrage from those who realize you've lied to the public using "subtle language" in order to grow at any costs.
In other words, the self-awareness to call out the "subtle language" and "questionable practices" implies that the company pursued those growth strategies despite CLEARLY knowing they were sketchy. This is extremely damning but those in power will try to deflect the problem as if someone leaking it is the issue.
I never would have worked at Facebook before due to their questionable policies. This incident reinforces that and shows that the problems come from the top. Now we'll watch debate there get muzzled in the interest of staying out of the news and growing at all costs. Perhaps a success for shareholders, but a failure for rank and file employees with morals.
Social media disruption disintegrated Libyan society to the point it could no longer function. The Arab Spring was a test firing of a psychological nuke. We should all be concerned that Facebook is planning to attack China, because they will defend themselves.
This kind of collaboration with an authoritarian regime is something that Facebook employees who value freedom might find to be distasteful, but if it furthers the company's lofty goal of "connecting everyone in the world", then, hey, it can be justified.
- They have a huge spam-bot problem that they willfully ignore;
- use aggressive email and notification technique to re-engage users;
- stone-wall traditional publishers to accommodate their distribution techniques
- sell and trade personal user information for advertising
- Have most security and privacy settings buried under submenus, almost as if intentionally built to be ignored
- Have a history of shady apps and games on their systems with questionable value to the user
- Have a culture of growth at all costs, without regard to the substance of that growth.
All of the above was known for years, has nothing to do with the recent breach or 2016's election. I just can't understand why this is a surprise or why we should trust a word FB says.
Zuck also got a free pass on his “young people are smarter” comments. I hope the rampant ageism catches up with FB too.
I respect their level of propaganda skill: how they are trying to shape the narrative so that they have been victimized, are fully justified, and are not actually at fault.
What impresses me most is how homogeneously, fanatically unified their culture is.
It’s said a cult is that which can only survive by cutting itself off from reality.
What scares me is that people who are so fanatical about their mission and Corp are responsible for so many real human outcomes... "The best lack all conviction, while the worst / Are full of passionate intensity."
Now that mirage has a few cracks in it and executives are freaking out.
Reminds me of Westworld.
If you leak a trade secret, you can be fired. But if you talk about work practices, that's a legally protected right that these companies don't want you to exercise. If they can make the people who do feel ostracized, like they're betraying their family, maybe they'll reconsider sharing what goes on behind the curtain.
I assume you're ok with people being mad about leaked products they've been working hard on for years having their big debut ruined?
So why would employees be mad about the leaking of internal meetings or discussions? Because in the majority of cases, these leaks don't hurt company management; they hurt employees.
Several times in the past, Googlers have had their physical or mental safety threatened by such leaks. What you see as a leaked company discussion turns out to be a real-life doxxing of employees. Would Damore have been fired if someone hadn't leaked his memo externally? Probably not; he'd have been quietly transferred by HR and maybe managed out, but able to preserve his career elsewhere in the valley. After his firing, the questions from the internal company-wide meeting got leaked, and the real names of several female and transgender employees got forwarded to alt-right sites, which proceeded to harass them, send death threats, etc.
And one year, an "innocent" leak of Google's yearly company Christmas present, a brown envelope with cash in it, put the physical safety of Googlers at risk, because Google Shuttle bus stops are well known and Googlers are easily identified in public, and that day, everyone knew Googlers would be carrying hundreds of dollars of cash in brown envelopes.
These leaks often harm employees, not management.
Furthermore, you can see in this letter from Brian Katz, the head of stopleaks, more or less everything discussed here, and certainly, it coming from "management": https://regmedia.co.uk/2017/05/22/katz-letter-google.pdf (Actually, everything here is useful/interesting: https://regmedia.co.uk/2017/05/22/discovery.pdf )
Note that he expressly suggests bringing concerns to a manager rather than speaking publicly about them, which likely includes concerns about working conditions or illegal conduct. He also cites the whole "damages our culture" aspect that is extremely relevant to the topic at hand.
Edit: Removed off-topic addressing of removed content from parent post.
If you leak say, a private thread from internal mailing lists, you could lose your job, especially if it could harm other employees. If you summarize some issue, you probably are on steadier ground.
Google has a vibrant internal culture of criticism. Memegen has already leaked (https://gizmodo.com/5946769/google-workers-make-internal-mem...), execs and products get raked over the coals at weekly company wide meetings, there's actually internal comics that satirize company beliefs.
And yes, management doesn't always listen, probably the single biggest reason unionization might be needed in Silicon Valley, not for worker pay and benefits, which are already good, but to ensure management listens on other issues.
BTW, the TGIF transcript and meme leaks are the kinds that have ended up Doxxing employees and triggering harassment. The gizmodo link I put properly redacts the usernames, but other leaks have not.
People should not say things, even in a private corporate setting, that they would not be comfortable with if they were someday public. Not only can a lawsuit or investigation bring internal correspondence to light; some of my correspondence could even be FOIA'd! People fail to realize that their corporate correspondence is not a private space, and to treat it accordingly.
I do have a lot of sympathy for targeted folks like LGBT folks getting harassed, particularly after events like Damore and the related lawsuits and embedded posts/content. But in many cases, they are already publicly identifying their RL name, gender identity, and political positions on Twitter, for example. They weren't so much "doxxed", as they became higher profile due to having been in the news.
This sucks a lot, but we can't hide important information from the public based on the unfortunate reality that soul-sucking nightmare trolls chase every name that gets even the briefest of public attention... imagine if we took that approach/mindset with leaks coming out of the White House.
PS - I really wish more of Memegen was leaked/public. I can't count how many times a news story has led to the thought "darn, I wish I could see what Memegen looks like today".
As opposed to doing what else? Emptying bins?
They are employed to spin news for the benefit of the corporation. They have risk assessments and plans of action for the most likely eventualities. Their job is to do exactly what they've been doing in response to this leak.
Google, as a corporate entity, definitely did not gain from the leak, and certainly would've preferred it not leak.
Below is an excerpt
McDonald's and more general retail businesses (Target, Walgreens, call centres, and so on) are the ones both installing massive surveillance and actually getting it analyzed. Or at least enough of it to pay me loads of money to make that possible; from the conversations I've had, they do this routinely. And frankly, I'm pretty sure that since this is directed by low-level managers (you can't do it otherwise, not at that scale; all tracking is done by the store manager), there will be tons and tons of abuse of these systems. I mean for trivial reasons, like stealing, attempting to make a false accusation of stealing stick, stalking girls (or even men), that sort of thing.
From what I can tell, at FB the entrances and exits, presumably some high-value locations inside, and to a lesser extent the parking lot are the places under surveillance. I'm pretty sure that at most places in the FB buildings nobody can see you on any security monitor. It's one of those open-plan offices: not much privacy, easy to put everything under surveillance, but they're not doing it. I've been to several call centres where surveillance was utterly pervasive in the same sort of environment, although the environment was much, much better at FB. Big open office plan, but the air was perfect, for lack of a better word. It wasn't cramped at all, it wasn't like those call centres at all; no wall smelled, not like smoke, not like oil, nothing like that. And of course you hear stories, like "sneaking" into the office on a Sunday for a board game, because it's an easy, central place for everyone to get together, being pretty normal.
Of course I realize that this is "fake" like it is fake at any company. Unless you're the CEO AND own a majority of the shares, the company is not a social club, it is not there to have your back. But they have that vibe going there, and they wouldn't have that vibe if they broke it for trivial reasons.
> legal threats and restricted stock units to prevent and detect intellectual property theft
Protecting yourself against insider threats by giving employees the means for a good life! How UNAMERICAN!
I'm pretty sure at your local supermarket you'd find those same threats, except they're totally pervasive. Every employee will have been threatened, and I assure you, not with restricted stock units or any kind of reward.
Discussing working conditions publicly is legally protected by worker rights laws.
So yes, protecting yourself against that (by firing people for discussing working conditions, for example) is both un-American AND illegal.
My understanding is that FB does not have a "fear-based" culture that would've prevented leaks, so really the only way they could keep people in line at scale is a cultish element in their onboarding process that makes people "love" FB so much that employees are genuinely offended by a leaker and make comments like this.
Google's security team is run by a guy named Brian Katz, who has been the subject of a lawsuit before: https://arstechnica.com/tech-policy/2016/12/anonymous-google... He also allegedly threatened a bartender who found a prototype Nexus 4: http://www.dailymail.co.uk/news/article-2224589/Google-threa...
Apple security folks have tossed a guy's home looking for a prototype iPhone, escorted by police. All of them, allegedly, had badges: https://gizmodo.com/5836990/lost-iphone-5-investigators-were...
They obviously can intimidate employees into silence, but it's far more useful, and better for morale, to go for the "loyalty" angle and make sure employees shame those who leak.
I would imagine people at FB that believe working there is objectively good are having a hard time reconciling that belief with the negative externalities that are playing out in society at large.
If you see working at one of these companies as a status symbol, you'll do whatever it takes to protect that image.
There's a strong selection bias at work in these posts, and I think it's a serious statistical error to assume this represents the median view of Facebook employees.