Hacker News
Internal Facebook posts of employees discussing leaked memo (theverge.com)
486 points by coloneltcb on Mar 30, 2018 | 404 comments



The Guardian has a piece discussing the FB employees' reaction to this leak, and it was downright scary. Many called for hiring workers "with integrity" while talking about how this leak was destroying FB's perception as a great place to work.

How scary it must be to work at a place with such an overwhelming "don't rock the boat" mentality. Leakers everywhere, and at Google, FB, and Apple especially, risk their jobs and their careers to give the public an open look at places which hold overwhelming power over our personal lives.

FB's internal perspective on privacy and its goals is vital for the public to know; it shouldn't take the next massive breach of trust triggering an investigation to learn the details. A leaker, sorry, I mean someone "without integrity", in 2016 could have done a lot of good.


I don't know about Facebook in particular, but what you're missing is that the reason employees can debate internally about policy is that they trust it won't leak externally. The risk of leaks eventually results in companies clamping down on security, so most employees aren't told anything that's not already public, unless they need it for their job. (Much like Apple has been all along, where employees only know what they need to know.)

So I would ask where you'd rather work: at an employer that trusts you not to leak stuff, or somewhere that doesn't trust you? If it's the latter, you might as well be a contractor.

You'll find this is true in most organizations, not just companies. They want to know if they can trust you with their secrets. It does require some faith that internal debate will help the organization make good decisions most of the time, which admittedly can be a stretch sometimes.


I think you're missing the greater context here. Bosworth's memo is a classic case of the end justifying the means. This is almost an admission that they knew things they were doing would be deeply unpopular.

The post in full doesn't read at all like it was to stimulate discussion. It reads more like it was to silence dissent.

If you really wanted to stimulate discussion and gather employee views on this stuff, you'd send a survey round. But the relationship is still asymmetrical between boss and employee.

All you're doing when posting something like this semi-publicly is creating an environment where quieter, more conscientious views get shouted down by the loudest voices.


>It reads more like it was to silence dissent.

Which makes his response read all the more hollow. Calling it a straw man? The post seems to have been an all-out justification for immoral behavior by an executive. I can't imagine a Jr. Engineer or someone fresh out of college with their MA in Stats feeling super comfortable hopping in and going, "Hey, this sounds unethical, and if people saw you saying it they would think we're hella fucked up." I'm relatively low-ranking at another big SV company and the thought of needing to stand up to a high-ranking employee like that is more than a little intimidating.


> the thought of needing to stand up to a high ranking employee like that is more than a little intimidating.

And not just that, but to be expected to "contribute to a discussion" in such a way that all your coworkers can see. I think, as somebody who takes objection to that memo, I'd probably be more inclined to look for alternative work.


Same. I'm not even THAT low-ranking, and it's still a big deal when I push back against a VP or director on something that's in my area. For some topic where I didn't even feel like an expert, against language like that, and with morals at stake? I think I'd just silently start looking for a new job.


> I think I'd just silently start looking for a new job.

That's good though. Boz already made the decision to prioritize connecting more people, despite the costs. He didn't have to tell his employees, but he did. This allows you to make the decision on whether it's worth staying at Facebook.


So basically, he’s admitted to unethical behaviour? Is this why his heart is breaking?


It's good for that employee, but it doesn't do anything for the billions of people using Facebook.


I had disagreements with Boz via text while I was a rank and file employee at Facebook. It didn’t change his mind, and I still thought he was incorrect.

But I’d sure as hell take that situation over many others I’ve had where my only contact with execs is through occasional, content-free memos.


I'd guess that sending a one-sided bombshell like that would tend to stimulate debate rather than suppressing it, at least in a company with a tradition of lively internal debate. Maybe in more top-down companies it's different?

But yes, the people in power do tend to be heard more in internal debate. (Not necessarily just managers though. Good or controversial writers can also have a lot of influence.) And this does mean more soft-spoken people sometimes don't get an equal voice.

Online discussions are often more heat than light. I don't think internal discussion can be replaced by surveys, though? They're both useful.

There are also problems that equality doesn't fix. As the number of people scales up, the power of each person gets smaller. Filling out a survey when you know it's one of hundreds or thousands tends not to feel very empowering, or even a good use of your time.


I guess I'm comparing and contrasting this with the infamous Google sexism memo.

They canned an employee for saying something unpopular they disagreed with.

The fact that this went such a different way says something. Maybe that something is "the cultures of Google and Facebook are so different it explains the discrepancy." But maybe it's "Facebook wanted to float this as an ethical trial balloon."


They canned an employee for creating an unholy PR shitstorm outside of the firm.

Basically, this is the end of the tech world as a draw for all those people who used to join these firms because they believed tech would make the world a better place.

Now it's going to be pretty much closed communication and minimal interaction internally. If you have an issue, well, tough balls, tech is no longer good for that; god forbid it shows up on HN. If it shows up in the media, that's career suicide.

I suspect it's probably time for HN to be shut down soon as well.


No, they didn't: they didn't can the leaker who actually created the shitstorm, they canned Damore to appease the outraged.


I must admit, when I read it, it did absolutely read to me like a contrarian attempt to start debate. I find it hard to believe that anyone would post sentiments along the lines of ‘we connect people, so what if someone commits suicide’ without it being a deliberate attempt to start debate.

If he actually believed that stuff, it would be extraordinarily scary. I don’t think he does.


"The risk of sarcasm is that you're taken seriously."

Trolling is a terrible leadership technique.


Have you read any of HN’s opinions on self-driving cars? There is a strongly held belief that individual deaths don’t matter as long as, on average, deaths go down. You can argue that connecting people is inherently good, so how are Boz’s opinions any different?

Boz’s pieces over the last few years tend to fall into the “Strong opinions, weakly held” category. I also suspect he argues a point that is stronger in sentiment than he really believes, to help his message stand out.


The problem is, you might think that but this is senior leadership putting out an email that is setting the general culture and direction for the company. He’s explicitly recognising and endorsing “questionable” (his words!) decisions made.

He can’t come back and say he didn’t mean it. Besides, if he was trying to spark debate, doesn’t that mean he thought it was even potentially justifiable to use unethical practices to drive growth?


I agree it was very poorly judged. Regarding your last point, I honestly assumed that he had seen stirrings of this kind of reasoning within the company and was attempting to ridicule it.


Are they truly unpopular? The events in question happened several years ago, and while this specific incident wasn't known, it was well understood that FB was profiting from harvesting personal data.

In my daily casual conversations with co-workers and friends, the topic is very rarely raised, and only to speculate why the timing of the issue seems to be tied to a rise in conservative politics.


The consequentialism is not the talking point in the memo. What's controversial is his valuation of "the ends": that "connecting people" has greater utility than the life or happiness of a minority of those connected.

I'm not going to make any assertions about the intent of the memo because I don't know the context, but from the outside, the logic expressed in it seems to have been their strategy already.


> This is almost an admission that they knew things they were doing would be deeply unpopular.

Yes, and there were discussions internally about this. One could argue that the recent shift from promoting pages' content to promoting your friends' content might have been the result of that.


The post will always look different to outsiders, won't it?

Most of the justifications for it appearing as dissent-silencing will have to be after-the-fact justifications as a result.


Perhaps you would send a survey; Bosworth chose a different option. Internal memos leaking out to the public without context is a classic case of causing FUD.


I prefer the distrustful organization.

I would never be tempted to suspend disbelief that this one time my opinion, effort, or goodwill actually mattered.

I've fallen for the "trust us" scam too many times. Embarrassing. Ever more, the only thing I trust is mutual distrust.


I've never worked for a company that gave employees a forum to talk openly. Company policy was always set by upper management behind closed doors and broadcast down to everyone else. Discussions and disagreements were handled privately.

I like the idea of employees having open discussions about company policy and direction, but I never would have believed such a thing could exist at enormous companies like Facebook and Google. Though, given recent headlines, I'm not sure it will survive much longer.


How about a company where decisions are shared internally and employees feel empowered enough to speak up, externally if needed, against what they see as an injustice when internally nothing is done? That shouldn't be too much to ask for. Perhaps some tech unions are needed to enforce this, as shareholders and owners would probably rather have just the dichotomy you presented.

This just reveals how big the power imbalance is between employees and executives that we’d have to make such decisions.


When you say speak up, do you mean internally or externally?


I thought it’d be clear but I added clarification.

Speaking up internally when you’re not on the board often doesn’t get much done. At least if it’s a moral objection.


If you want to get something done, you need to think about who gets to make the decisions and how to influence them.

Depending what it is, speaking up externally might not get you more pull with internal decision-makers? Particularly if they feel betrayed.

Or, maybe a big external stink could cause action? Depends what it is.

It's quite possible that neither would work, and then you've burned your bridges for nothing.


> At an employer that trusts you not to leak stuff, or somewhere that doesn't trust you?

That's a false dichotomy. I'd like to work for an employer that would not mind me discussing my work-related stuff outside without immediately classifying that as a leak, and I'd like my employer to trust my judgment in knowing what is and what is not appropriate in such discussions.


Discussing stuff outside work is different from leaking internal communications verbatim.

I left Facebook for a place where the stakes are a lot lower, but information leaks like a sieve.

It’s really disheartening to know that a lot of new coworkers would prefer to leak their “spin” to the press and actively try to damage the company when they don’t get their way.


It all depends on what the public interest angle is. In the case of Facebook the hypocrisy on display borders on the unbelievable. Facebook arguably infringes in the worst way possible on the privacy of a very large chunk of humanity but is highly offended when its own 'private' communications are exposed.

What's good for the goose is good for the gander, and companies with this much influence on the world should welcome transparency, not oppose it. And if they do not welcome transparency then we'll have to help them along a bit every now and then.


Facebook doesn’t maliciously expose private user data in order to inflict harm on people.


No, they do it to make money. But that's all the same to me.


Employees would probably feel more comfortable about debating opinions regardless of company trust if their opinions didn't involve Orwellian kinds of user manipulation and "questionable practices".


Unfettered, leak-free debate, or heavy secrecy? Seems like an easy choice, especially on paper, just like unlimited vacation vs. 3 weeks paid — the devil is in the implementation and unforeseen consequences.

I think the reality with FB is that the current idealized system is not necessarily the best, and likewise, the occurrence of leaks is not necessarily a sign of impending doom. Sometimes leaks are a necessary symptom for when an organization has gone off the rails and has failed to self-correct. I don’t think any organization enjoys or wants leaks — just as no human enjoys sneezing or diarrhea — but sometimes the temporary discomfort is necessary for long-term health.


So you could say, they value their privacy and don't want their comments shared with people they didn't authorize?

It's ironic to see employees complaining about what is essentially a lack of privacy, when the company they work for goes out of its way to convince everybody that privacy is a thing of the past, and in so many words, so does the very Bosworth post they would protect and keep private. Eat your own dog food.

And then one of them says that whoever leaked the post (the whistleblower, is how I would refer to them) lacks integrity. Integrity? You work for Facebook. Has it never occurred to you that maybe you're the baddies?


The word "integrity" may have a different meaning inside a tightly-knit corporate culture[1] than on the outside, just like "honor" means something different inside the Mafia, where it means "you can steal and murder, but above all, keep your mouth shut".

A corporate code of silence that insists on the absolute privacy of internal communications is similar to the Mafia's code of silence, in which it's considered bad form (punishable by death) to blab to the authorities:

https://en.wikipedia.org/wiki/Omert%C3%A0

"Omertà is a code of honor that places importance on silence, non-cooperation with authorities, and non-interference in the illegal actions of others."

It will be interesting to see what kind of documents this and other whistleblowers will decide to leak in the future. Facebook needs its own Snowden to expose its inner workings.

[1] I suspect that the corporate obsession with secrecy we're seeing here is not unique to Facebook. What's unique is the irony of a privacy-destroying company insisting on its own right to privacy.


That's a false dichotomy. It's better to work somewhere where transparency isn't a problem.


Exactly.


> So I would ask you where you'd rather work? At an employer that trusts you not to leak stuff, or somewhere that doesn't trust you? If it's the latter, you might as well be a contractor.

Employers which still use trust at scale are ignoring their risk analysts. The risk of a secret leaking is proportional to the number of people who know the secret. You can reduce the risk with Stasi-style surveillance, or legal enforcement (e.g. legally classified state secrets), but few people wish to work under those conditions.

It's a false dichotomy because people would rather work for an employer that trusts them with the secrets they need to get their job done, and doesn't trust them with the secrets they don't need, a.k.a. the principle of least access. Openness in organizations is important insofar as people can attain access to information they need when they need it, but not unlimited access to everything, which ultimately reduces organizational trust when leaks inevitably occur.


> At an employer that trusts you not to leak stuff, or somewhere that doesn't trust you? If it's the latter, you might as well be a contractor.

I'm a contractor who has been on the same team of FTE devs for a year. It's going really well, but I was a little hurt when I realised they were reading CVs to fill a vacancy on the team without putting me in the loop.

Edit: I understand why they are doing it, I mean I'm from a big consulting company (Alten), but still it stung a little, especially since I'm on pretty good terms with the rest of the team.


This sort of post isn’t a debate. It’s a pep rally.

Personally, I’d rather be in the dark about policy than be snookered into thinking that I have some meaningful input.


Oh the irony. People trusted Facebook to keep their data secure and it 'leaked' to C.A.

It seems apt that internal debates and posts would leak to the outside world.


I was a Facebook employee for several years; I left shortly before this memo was drafted. The environment was pretty much the opposite of "don't rock the boat." Dissenting opinions were encouraged and openly discussed, but everyone understood that could only happen if it didn't undermine the PR department's job.

Boz fostered this culture by example, publishing internal memos critical of the company. In this case, questioning whether the company's driving mission was the universal good that leadership thought it was when it was adopted. Having such high-profile dissent in circulation gives more cover to individual contributors with a gripe than any amount of policy language would.


> In this case, questioning whether the company's driving mission was the universal good that leadership thought it was when it was adopted.

Except that, Bosworth's shabby recantations aside, in the original post the driving mission was not questioned but reinforced to an extreme, cult-like degree by a high-ranking Facebook official. He didn't mince words and sought no compromise: growth at any cost, using unethical methods and to the point of endangering people, if that is what it takes. Growth is a good by definition, regardless of what your antiquated, pre-Facebook morals tell you.

This is a wide extension of the field where debate is possible and a strong reinforcement of unethical behaviour: "Facebook and Boz have your back, and anyone questioning growth is an enemy". What was previously unspeakable is now under debate; we are debating the degree of acceptable unethical behavior, and Boz's position seems to be "to any degree". This was merely 20 months ago, not in the distant past when Facebook was founded.

It's specious to call this an environment of open debate; it's a bold move toward the organizational culture of a cult or criminal gang. It's not surprising at all, then, that the current debate centers on ways to root out the traitors and select employees for "integrity" (unflinching loyalty).


I've read it over a few times again, and think I know why it's so divisive. In the memo, he describes a state of affairs, with two possible subtexts:

1) This is the state of things today, and the uncomfortable truth of how we got here; what do we do about it?

2) This is how things both are and should be; either get in line or leave.

I, obviously, gravitated towards the first interpretation and you the second. Without further context, I'm not sure there's any way to really know which was intended.


It seemed more declarative to me. It seemed more about clearly delineating the ugly parts of a pre-existing ideology, not suggesting that there be any change, but that people should acknowledge the consequences of pushing the "connecting people" philosophy.


When viewed in the context of a conversation of whether “the company's driving mission was the universal good that leadership thought it was when it was adopted.” can you see how Boz’s post could move conversation in a positive direction?

I know this is hard to believe from the outside, but most Facebook employees believe that Facebook can have a positive influence on the world. It’s deeply ingrained in the company’s culture. An executive doesn’t just come out and say “it’s all business, fuck the consequences”.

This is why not leaking things is so important. The context and culture within a company change how a message is interpreted.


This falls within: “The smart way to keep people passive and obedient is to strictly limit the spectrum of acceptable opinion, but allow very lively debate within that spectrum....” ― Noam Chomsky

It applies to pretty much any tech company that claims to love debate and dissent internally.


Reminds me of Orthodox Judaism (other religions may be similar, but it's what I grew up with). Intense debate was highly valued and encouraged, but as soon as you questioned the fundamental truths that the belief system was founded on, you went too far, e.g. questioning whether the Bible was written by God, or whether God even exists.


That in turn reminds me of Al-Ghazali [1], and the impact he had on Islam. Unable to resolve why some things seemed to contradict Islamic beliefs, he developed and successfully spread the view that there's actually no such thing as causality or logic -- that every single thing is an independent act of god. In other words a leaf does not start burning when exposed to fire because it reaches a certain temperature (speaking loosely), but rather because god decided he'd set it alight at that exact moment. And of course that ash is not created by the fire, but instead by an instantaneous decision by god to turn the burnt object to ash. By rejecting any and all causality, he was able to dismiss all logical issues by simply asserting that causality and logic are social constructs. And that belief spread like wildfire, as such rationale that offers easy explanations for uncomfortable to accept phenomena is wont to do...

Today somewhere around 1/4 of the world's population is Islamic. And there have been a total of 3 Islamic Nobel laureates in the sciences. It's a rather nice demonstration on the question of whether 'geniuses' are born or made. If Allah's hand is not chained, what point is there in seeking to discover these alleged laws of nature?

[1] - https://en.wikipedia.org/wiki/Al-Ghazali


Over half the number of Muslim Nobel laureates (sciences and more), according to that Wikipedia link, have occurred since the year 2000.

So, obviously, since the graph is spiking, we can expect to see lots more.

This is my sarcastic way of saying that taking a single metric which is affected by tons of different factors, and applying it to a complex argument about, basically, sociology/anthropology (human behavior and culture) really doesn’t provide a lot of value.

I think your post opens a door to a lot of interesting conversations, but that using the # of Nobel Prize nominations per religious / cultural group as a metric closed most of those doors.

It’s also not very scientific.


Please keep the crypto racism off HN.

The Nobel Prize is a European institution.


I'd say it's the same in other religions (from my personal anecdata with Catholicism).

But I think that, in such a context, the acceptance of fundamental truths is necessary to have a debate, like mathematical axioms are necessary for proofs. In addition, the fundamental truths are about one's Faith, so I don't think there's a lot to debate; either you believe or you don't.


Spot on. It's an effective technique to give people the illusion of having explored all possible options and arguments.

Relevant: https://en.wikipedia.org/wiki/Overton_window


That quote is meant to be applied towards government and society. It doesn't make sense when you apply it to companies. Most employees are passive and obedient as long as they get paid. Do you think IBM or Goldman Sachs employees are allowed to dissent internally? I'd much rather work in an environment that's somewhat open than one that is completely closed.


Is your argument that Facebook allows more dialogue than Goldman or IBM, so this is okay?


Perfect example of what Chomsky was talking about, and the former cult member above who sincerely believes they were in a free speech zone demonstrates how effective this management technique is.


Basically saying https://en.wikipedia.org/wiki/Wedge_issue with more words :)


This seems to assume that public debate is masterminded according to a particular design.


Debate in a corporate environment is, in fact, masterminded. In these cases, it's often specifically encouraged, in a certain fashion. The venues for the debate are built and moderated for that purpose, backed by the policies of allowed conduct of the employer.


An interesting comment from comments on TheVerge:

It’s probably worth bearing in mind that in any company that pushes this kind of ‘open’ communication, there’s an unavoidable pressure on most ordinary employees to say the ‘right’ things. A company that has so much of its internal correspondence open and visible to anyone will very quickly descend into 1984 territory. So those ‘dense’ folk bleating about integrity are likely to really be saying ‘I would never leak, boss, you can trust me’. And in all likelihood, the actual leaker is one of those voices. Personally, I find it baffling that so many supposedly intelligent people see an office under the Eye of Sauron as a Good Place to Work.


>"Boz fostered this culture by example, publishing internal memos critical of the company. "

I read his memo titled "The Ugly." There was nothing in there that was critical of "the company." The only criticism I read was this individual being critical of people who might be prone to self-reflection. Judging from the memo and other employees' characterizations of him, "Boz" just sounds like a total asshole.


Yes, it does seem like it. However, I would leave the door open a crack for irony. This is hard to judge out of context, but he has to have anticipated (encouraged?) pushback.


Does the memo really come across to you as "critical" or "questioning" anything?

> That’s why all the work we do in growth is justified. All the questionable contact importing practices. All the subtle language that helps people stay searchable by friends.

> I know a lot of people don’t want to hear this. Most of us have the luxury of working in the warm glow of building products consumers love. But make no mistake, growth tactics are how we got here. If you joined the company because it is doing great work, that’s why we get to do that great work.

I understand what you and others are trying to say. That he somehow disagreed with his own words, and wrote it to start an internal debate.

But is the reader supposed to know that from some kind of context we can't see? From the words themselves, it seems pretty clear he fully supports the continuation of, in his own words, "questionable" practices.


He said that he didn’t believe his own words. I just don’t believe him. He literally tells employees all the dodgy things they are doing is justified to help them “connect” more people. In no way do I believe he didn’t believe that.

Not only that, but he told his employees what they are doing is totally justified and to keep doing it, because it was sanctioned by management.


This is how I would communicate and do communicate when people are making morally borderline choices.

The most common reaction for people is to ignore the moral implications, a la wall street "We are unlocking value".

If Boz went 180 degrees and said "welp, that's it, growth is over, we have a major disaster in a few years," the GREATER force would murder him, namely shareholders.

Even now, Facebook's greatest pain is coming from the hit to its wealth, not from the number of uninstalls coming via "deletefacebook".

At this scale and size for a large top-tier tech company, the man in charge is expected not to rock the boat. Any course correction occurs slowly, or through crisis.

Apparently we are doing it by crisis.


Please don't try and spin the term "don't rock the boat." A company that brutally cracks down on leakers and has employees en masse calling them "people without integrity" is the epitome of that mentality.

It's great they feel they created a microcosm of openness within the company, but that doesn't seem to have made it act any more morally when it comes to protecting user privacy.


Scary, it seems like they use Newspeak internally.


>Boz fostered this culture by example, publishing internal memos critical of the company. In this case, questioning whether the company's driving mission was the universal good that leadership thought it was when it was adopted.

“We connect people. Period. That’s why all the work we do in growth is justified. All the questionable contact importing practices. All the subtle language that helps people stay searchable by friends. All of the work we do to bring more communication in. The work we will likely have to do in China some day. All of it”

This is not dissent, this is not 'rocking the boat', this is not being critical of the company. He's taking their mission to the extreme worst case scenario and saying that even then the mission is justified. It's the polar opposite of the things you assert it is.

FB is covering by saying no one agreed with it and it was only there to provoke discussion. Why delete the post and its discussion, then? It's obvious the discussion on the post wasn't critical enough to actually provide cover for these excuses, so they burnt the post and are now lying about it.


"Why delete the post and its discussion then?"

Because the post and the discussion was leaked publicly.


The question remains the same: so what?


You and all the other former Facebook employees sound like people who are working hard to defend the money you made there. Because the dirty looks members of the general public now give those who made their money from Facebook probably gets to you.

Unless you were there since 2005, Boz was a higher-up with more seniority, so the original memo was more of a put-up-or-shut-up piece than an RFC.


> so the original memo was more of a put up or shut up piece than an RFC.

That couldn't be further from the truth. Facebook's internal communications happen almost exclusively through Facebook itself, meaning this "memo" was most likely a Facebook post, complete with liking, reacting, and commenting capability from anyone in the company.

Buzzfeed touched on this in their version of the story:

> One former employee who spoke with BuzzFeed News noted that they remembered the post and the blowback it received from some workers at the time. “It was one of [Bosworth’s] least popular and most controversial posts,” the ex-employee said. “There are people that are probably still not in his fan club because of his view.”

Source: https://www.buzzfeed.com/ryanmac/growth-at-any-cost-top-face...


I think the issue is that Facebook hasn’t done anything to curb bullying, terrorism, fake news or electioneering. They allowed CA to abscond with user data and they took Russian money to target voters. If they can’t track their ads or comply with federal election laws, that’s a big problem.


> I was a Facebook employee for several years; I left shortly before this memo was drafted. The environment was pretty much the opposite of "don't rock the boat." Dissenting opinions were encouraged and openly discussed, but everyone understood that could only happen if it didn't undermine the PR department's job.

It likely depends on where you are in the company. I haven't worked at FB, but I have a few friends who have in the past. One of them told me something pretty similar to what you experienced; the other had the exact opposite.

So I'm honestly not sure what to believe about their culture. It seems that once a company gets big enough, the culture becomes multiple subcultures, and I'm not sure there is a single culture driving the company anymore.


It's also a question of the importance of the issue that you're "rocking the boat" about. It might be completely OK to argue with your manager about how the UI for a particular feature should look, but not OK at all to question the ethics of the company's mission, or the ways in which it makes money.


Arguing that internal company discussions should be all public is like arguing that people should have no privacy. That's a slippery slope.

For example, how would you feel if you told your spouse about the grudges you have with your friends or work colleagues, and then (s)he went out and repeated it all to them? That wouldn't be a very happy marriage, imo.

In society people that spread gossip are marginalized by those that hate gossip, because we have private affairs that we'd like to keep private. It's a natural phenomenon.

And yes, I see the irony of defending Facebook by invoking privacy. I try not to have double standards.


This is a company that wants to be the default medium for private communications among friends and loved ones, yet deliberately makes security settings opaque and actively encourages oversharing. Anyone who doesn't see that analogy -- especially those employed at Facebook -- is already riding down the slippery slope.


You've hit the key point from my POV.

I have a hard time having empathy for Facebook in this situation when their entire approach to users' information has been incredibly disrespectful. Constant TOS changes. Misleading privacy settings. Opt-out rather than opt-in sharing. Dark patterns designed to serve the company rather than the user ... and straight-up bad ideas. I deleted my account right after they did the TOS change that Cambridge Analytica took advantage of. (The one where your friends' choices would share your information. That was a transparently dumb idea from the get-go.)


What did leaking this memo actually accomplish? If you are someone who wants tech companies to be accountable for their impact on society you should welcome these internal conversations. All that the leak will do is make them less likely.


Conversations where bosses tell their employees we should connect more people and grow the network no matter the cost? You're right we should have conversations about this, but publicly.

Facebook surrendered its right to discuss such things privately when it willfully kept lax policies on sharing users' data. Stuff like this should leak earlier so we can talk about it before, rather than after, awful things happen.


I’m far from the biggest fan of Facebook, but I’m absolutely a fan of playing devil's advocate in an organization, if for no other reason than to solicit reactions and get people engaged. As someone who will use this device sparingly when appropriate, that’s really what this post looked like to me (as opposed to someone who was in it to get terrorists signed up to FB... really?). I honestly feel sorry for the guy


On the other hand, have we really gotten to the point where we have to try to provoke others into a debate? Why can't we state what we mean, what we think, what we're uncertain about, and what questions we'd like to discuss, in order to foster discussion instead of provoking it? Playing devil's advocate is fine when it's understood what's going on and why you're playing devil's advocate, but when there's ambiguity you end up playing this game of "yes, I said that, but I didn't mean it," which sounds as weak here as it does in every other case. Devil's advocate is a great cognitive strategy for exploring an issue together, but it's a very poor conversational strategy.


No, I don't see anything that says provoking others into a debate is the only means of conversation, just one possible way of prompting a discussion. I imagine a straightforward discussion as you described is the norm, and this could be one case where they were provocative and so it was selected to be leaked. But I agree with your second point that this does not appear to be such a case.


You don't inspire this sort of debate by putting up a straw man.

You inspire this sort of debate by thought exercise and by asking about actual application - you couch the conversation to direct your staff toward stronger ethics.

If this conversation were at Uber, in their self-driving car division, the consequences would be human life. The way to have that conversation, with context, would be to couch it in the "trolley problem" - because that would keep the framing.

Ethics, the word is ethics - Facebook is clearly lacking them. We're "dumb fucks" according to FB's chief - and the fish rots from the head down.

And the staff's response "find the leakers" -- funny how many groups of people I find despicable seem to chant this.


It’s really not a good idea to play devil's advocate as a high-ranking individual in a company without being super extra explicitly clear about that. People might mistake it for the company's position, especially if no other high-ranking individual contradicts or clarifies the company's position.


I do believe he meant it when he was saying unethical behaviors and negative effects on society were worth the greater good of connectivity.


When I play devil's advocate I clearly state what I'm doing up front. This smells like an attempt at rewriting history.


You are mistaken. At Facebook, the devils advocate would argue in favor of government regulation.


Memos like this allow for open discourse within the company. Leaks only encourage companies to be even more closed off. Facebook could easily hide their language in corporate speak if they really want to encourage people to drink the corporate Kool Aid.


That's not what the memo was, it was not a case where "bosses tell their employees what they should do".

It was a case of starting a debate by voicing an extreme opinion.


No it wasn't.

The memo is exactly and clearly telling employees what to think and do. No questions allowed.


Companies sometimes drink so much of their own kool-aid that they lose all perspective on what's actually important. Shining a light on conversations like this one can be an ego-check, where people who don't work at Facebook can say, "hey ... wait a minute here...".

It's an "emperor's new clothes" situation where we all get to play the role of the child.


The thesis of the memo couldn't be written any other way?

I think you could write about the ideas contained in Boz's memo in such a way that if the memo leaked you still wouldn't look like huge bleeps.

It's not the conversations that are getting them into trouble. If it were just this memo, then nobody would care.

Their actions are getting them into trouble. Leaking memos like this merely offers a window into their souls.


> It's not the conversations that are getting them into trouble.

I find these surreal, cult-like conversations a lot more off-putting than Facebook's data practices. Those I can understand; these conversations (and the words of these well-meaning employees, more than those of big bad Boz) make me feel like I need to take a shower. To me, this shows the very worst of modern-day North American culture, intellectually dishonest to the point of delusion, and it disgusts me.


I was with you until you specifically criticized North American culture. What makes you think that German or Chinese companies don't also push their employees to place the company's success above ethical considerations?


Of course it could have been written in the style of a press release or perhaps reduced to a politician-style soundbite. But although pablum is harmless when leaked, it doesn't have the nuance needed to give real direction to smart and powerful knowledge workers. It is also bland and may be regarded by thoughtful workers as insincere.


This didn't have nuance. If the guy isn't lying, he was throwing a bomb to get people to react; if he is lying, he was floating the worst let-us-do-evil-that-good-may-come company line I've personally ever seen. In neither case was this a nuanced statement!


It was much more nuanced than the headlines such as the one used by BuzzFeed in their original story [https://www.buzzfeed.com/ryanmac/growth-at-any-cost-top-face...]. He was trying to start a meaningful conversation, which is basically impossible to do under the constraint that you avoid giving adversaries any way to take your remarks out of context and spin them to manufacture outrage.


Well said. If I received internal memos that had gone through an external PR filter, the message would probably read like any other generic press release, and engender gossip and searches for hidden meanings in the memo.

Perhaps it could have been worded differently (better?), but I did appreciate the solid direction that was given by the memo. All too often, leadership is unable to give clear guidance because they are too wishy-washy about what the goals actually are, perhaps not even knowing what the goals should be besides making money.


>What did leaking this memo actually accomplish?

It showed that FB says one thing in public and then does the opposite in private.


There is the "WikiLeaks justification": leaking this memo will force Facebook to have more vigorous internal controls for locking down information, making them less efficient and hastening their downfall.


Wouldn't hiring people with integrity mean hiring executives that write memos they actually agree with?

>Bosworth distanced himself from the memo, saying in a Twitter post that he hadn’t agreed with those words even when he wrote them.

That is scary as shit. That they think the leaker is the one without integrity and not the executive team.

What terrible people.


>What terrible people.

The language was kind of funny to me even. Hunting down the leakers to make Facebook great again... the company sounds like the business version of the White House administration. If it's this difficult and requires this much secrecy to convince yourself that what you're doing isn't evil, then maybe something is very wrong on a foundational level.


> he hadn’t agreed with those words even when he wrote them

I'm struggling to see why anybody thinks this is a reasonable defence.

There is no indication he didn't believe it. The company's behaviour is consistent with it. He only said "I didn't mean it, it was to stimulate debate", and the classic "You're missing context" (which I am not able to show, of course) after drawing negative PR.

It seems very generous to me to give his recent tweet much credibility at all.


>Wrote another: “This is so disappointing, wonder if there is a way to hire for integrity. We are probably focusing on the intelligence part and getting smart people here who lack a moral compass and loyalty.”

Says the company whose very moral compass is coming into question.

Nazi soldiers following orders to line up minorities in slums and shoot them, or herd them into cattle cars - loyal, yes? Moral? No.

"Loyalty," in that post, to what? To Mark and the investors' bottom line, and tangentially with it the bottom line of you and your fellow employees? Loyalty to this idea of "connecting the world?" Is that really the value of Facebook?

It sounds like a cargo cult.


The stuff being reported around Facebook is certainly cause for serious concern. "Cult"[0] has a pretty contentious meaning which is problematic when applied to Facebook without further analysis—it can certainly be separated from issues of loyalty and morality. "Cargo cult"[1] really doesn't apply at all, at least in the context of what you're describing here.

[0]: https://en.wikipedia.org/wiki/Cult

[1]: https://en.wikipedia.org/wiki/Cargo_cult#Metaphorical_uses_o...


Thanks, you're right. I've been using the word wrong for a while.


And how do they "connect" people? Maybe I use Facebook wrong, but I see nothing much more than vanilla posts I have no interest in, from the same people, day after day. There is literally nothing I know of in the platform that would cause me to spontaneously meet someone new, unlike forums, meetup.com, etc.


> And how do they "connect" people?

Don't forget - corporations are people too. I think Facebook's mission might be to connect people with advertisers.


Established organizations are the wrong place for employees to effect change. They are not democratic; they may love to give that impression to employees and the media, but there is always a hierarchy and a powerful inner group that makes the decisions.

Debate is OK, but anything real that threatens the power base will be quickly dispatched. The only things that can change Facebook are Zuckerberg and his inner coterie, or strict regulation. But this is not a problem limited to Facebook. Google is worse, and there are others like Palantir and a pipeline of companies who would like to take their place.

Ethical behavior from individuals will only have an effect in smaller companies and early startups. When companies cannot find engineers who will agree to unethical practices, and when there is pushback, they quickly realise they may need to rethink.

But software folks have postured on freedom and liberty endlessly, then gone ahead and built some of the creepiest stalking infrastructure ever made, without a care for fundamental human values or ethics, and thus are not trusted anymore.


No?

People with integrity have done this all the time, people with integrity are standing up and bringing these issues up constantly.

But frankly, normal people can't be arsed to give up free services like Google and Facebook, because they don't, can't, and won't pay the full cost of those services.

Further - I am a dyed-in-the-wool non-Facebook user who was warning about this from the day it was created.

But Facebook employees are correct in what they say.

They genuinely believe that they must be a force for good, and that their websites will bring people together.

It is the MOST essential thing for these people to be able to talk to each other candidly and clearly while they still believe in doing good and being ethical.

Because once that ability to say uncomfortable things goes, the only other option is to become the corporate behemoths that all SV-ites hate. To become a suit.

I can't understand how people on HN are missing this.

Facebook has been regularly an enabler - but for all those years HN has been cool with it.

Now, when the shoe has dropped, people here are displaying the same overreach and lack of nuance that created this scenario in the first place.

Facebook is the least of all evils. People are ALWAYS going to create this miserable form of social networking, because it's easier and matches human neuro patterns closely enough.

But this is the one time we will have a single institution which is not yet culturally made up of suits, one that can institute change, or at least make the effort and fail correctly.

Facebook internally discussing this and realizing that there is no hope is more critical than people tearing facebook down.

Having a clear idea of objective reality, of being able to see our actual options as both employees at facebook, and as users of facebook (or friends with facebookers), is our best way forward.


Be careful what you ask for, FB'ers: if FB etc. get classed as CNI (critical national infrastructure), you're probably going to need a security clearance to access sensitive data - which is going to suck even more if you're originally from outside the States.

I know that some team leaders at one UK telco I worked at were asked to go through DV clearance - that's TS (drug tests, polygraph) in US terms.


Most companies have held internal memos like this as private to the organization. Any breach of that is a firing offense. FB, Google, Apple, etc. are not doing anything that corporate America hasn't been doing for ... for I don't know how long.

It is disconcerting how FB employees have come out in support of these ideas from the memo, though.


I think big tech has it very different, given how much these companies know about their employees' personal lives. Some information should leak within a healthy society, so backlash and corrections can occur before an election is compromised by a horrific breach.

For a view on what's different an excerpt from a Guardian piece is below.

---

“It’s horrifying how much they know,” he told the Guardian, on the condition of anonymity. “You go into Facebook and it has this warm, fuzzy feeling of ‘we’re changing the world’ and ‘we care about things’. But you get on their bad side and all of a sudden you are face to face with [Facebook CEO] Mark Zuckerberg’s secret police.”

The public image of Silicon Valley’s tech giants is all colourful bicycles, ping-pong tables, beanbags and free food, but behind the cartoonish facade is a ruthless code of secrecy. They rely on a combination of Kool-Aid, digital and physical surveillance, legal threats and restricted stock units to prevent and detect intellectual property theft and other criminal activity. However, those same tools are also used to catch employees and contractors who talk publicly, even if it’s about their working conditions, misconduct or cultural challenges within the company.

https://www.theguardian.com/technology/2018/mar/16/silicon-v...


One of the important qualities of a great place to work is that decisions are made based on sober analysis rather than reflexive corrections. Maybe you're right, and Facebook is just too important to allow that, but I think it's fair for Facebook employees to be unhappy about this.


Do you honestly believe you can’t have both? I’m not saying all data should be public, what I am saying is if what you’re discussing is morally repugnant given your past history it shouldn’t surprise you if it gets leaked.


The problem is that lots of people find lots of different things morally repugnant. If it's normalized for people to leak things that they consider morally repugnant, that means there are serious costs to engaging in any controversial discussion. There's no way I'm going to talk about diversity at my company if I think someone might go tell a reporter what I said; there's no viewpoint on the issue which isn't offensive to someone.


I think we’ve all blocked out the scary part of Pinocchio, where the boys go to the carnival (Pleasure Island). Spoilers: it ends with them being treated as literal livestock.


Leakers are a HUGE problem, especially when they reveal that the emperor has no clothes. (hello Googlers and FB-ers)

BUT, I've read the memo, and as I saw it, he says that getting people connected can mean that bad guys will also be connected. But that is life. Terrorists and child molesters will use a smartphone too...but


The employee reaction is natural. They feel their privacy has been violated.


The data-driven espionage companies should be neutered. Ads are a joke; that's not how they make their money.


This one is my favourite:

"This is so disappointing, wonder if there is a way to hire for integrity. We are probably focusing on the intelligence part and getting smart people here who lack a moral compass and loyalty."

Note how the focus is not on the morality of honouring users' privacy, but on the "morality" of protecting the company. Collecting personal information about billions of people, using "questionable" practices, then selling access to it. It's a "good job". One that the commenter does not want to lose. Understood.

However there are laws to protect companies against employees who leak secrets. Employees sign nondisclosure agreements. Companies can adopt no tolerance policies on leaking to the media. They can terminate employees who violate them. No employee needs to consult a moral compass; the rules are clear. Break them and there can be grave consequences.

By contrast, there are no equivalent remedies available to users whose privacy has been entrusted to Facebook. There is nothing to keep FB honest. There are no grave consequences for violations of user privacy.

When there is a "leak" of users' information, the user is entitled to nothing more than an impersonal apology.

Relative to other businesses, one might go so far as to believe "there are no rules" in the space where FB has operated. Users (who are not the customers of FB) have no recourse; there's nowhere else they can go.

In all seriousness, it is the user who must hope that every FB employee has a "moral compass". Whether FB employees can trust each other is not what the user wants to know. The user wants to know if she can trust FB's employees.


I found that part wow-worthy too. I wonder if they thought whistle-blowers or leakers like Manning or Snowden have "moral compass" and "integrity", or if they're traitors without "loyalty".


Funny, didn’t Zuckerberg say that “Privacy is dead”? I guess only for users of his product.


Not a surprise really.

Many jobs considered "good" require a person to forget about ethics.

The tobacco, oil, pharma, agro, car, and ad industries. Now we can also add the "social web" to this list.

A person willing to suspend morals in exchange for money is one you should not trust. Especially if they cannot be held accountable for their actions.

Because hell knows what ELSE they are ready to do.

And they will fight to defend their source of income = "loyalty" = "protecting questionable practices" = "indulging in mafia-like behaviors".



It's kinda ironic that a post about "All we are doing is connecting people and information" gets deleted because it gets connected to a lot of people ("leaked").

You can't simultaneously hold the opinion that there's no harm in sharing information, while also holding the opinion that there is such thing as a "leak." It just strikes me as so ironic for a company that champions "privacy is dead, live with it," to have to delete its own valid internal debates because of the consequences of lack of privacy (i.e. leak = lack of privacy).


You may find it additionally ironic that about a month before the Boz memo, it was reported[1] that Zuck bought 4 houses located around his own for $30 million.

Privacy is looking pretty alive in that neighborhood.

[1] http://time.com/money/4346766/mark-zuckerberg-houses/


Zuckerberg built a 6 foot wall around his 700-acre estate in Hawaii because he values his own privacy that much: http://www.newsweek.com/facebook-mark-zuckerberg-wall-hawaii...


If it weren't for the unearned and unassailable value of the network effect Facebook benefits from, this disgusting behavior (from Zuck and his minions) would be enough to drive everyone to a new platform.


"The more secretive or unjust an organization is, the more leaks induce fear and paranoia in its leadership and planning coterie."

"This must result in minimization of efficient internal communications mechanisms (an increase in cognitive "secrecy tax") and consequent system-wide cognitive decline resulting in decreased ability to hold onto power as the environment demands adaption."

"Hence in a world where leaking is easy, secretive or unjust systems are hit nonlinearly relative to open, just systems. Since unjust systems, by their nature induce opponents, and in many places barely have the upper hand, mass leaking leaves them exquisitely vulnerable to those who seek to replace them with more open forms of governance."

-- Julian Assange

http://cryptome.org/0002/ja-conspiracies.pdf


Yeah I feel like secretive and unjust are two completely different things, and Julian Assange conflating the two doesn't reflect well on his already questionable character.


Systems that are just do not generally require secrecy to the degree that unjust systems do, precisely due to the fact that when publicized, people will generally agree with just processes.


Except that even a just system has parts that can be taken out of context, and twisted to paint a narrative that it is not just.


There are a whole bunch of examples of this, though "hide the decline" sticks out to me.

It's also the justification for keeping courts open to public attendance but barring recording equipment (in Canada).


That's why I said "to the same degree", i.e. privacy is important, but you don't have to have the massive opacity required for large-scale unfairness to persist.


> You can't simultaneously hold the opinion that there's no harm in sharing information, while also holding the opinion that there is such thing as a "leak."

Likewise, you can't simultaneously hold the opinion that users should have control over where their content is seen, and that it's OK to publish and comment on an internal post. In a less spiteful world, some of the employees' reactions might have been taken as evidence that they do understand and care about issues of privacy or containment. Maybe that would lead to more collaboration on solutions, which is necessary because there are actually some tricky tradeoffs here. But that doesn't give the same dopamine hit as cutting down the tall poppies, right?


I guess I don't understand your point. But I'll elaborate because I'm interested in this topic.

My point is that Facebook's ethos, that a post-privacy era can exist and be okay, is betrayed by how they clamor when it's their own privacy. It makes it feel like it's a one-directional relationship.

Perhaps you're suggesting facebook's ethos is actually that a privacy middle-ground can exist, where people can choose what gets shown where and to whom. I can't disprove that that's their ideal, but in light of their data-sharing I have no reason to believe it.

Aside from ideals, we can point out the consequences of Facebook in practice. Facebook is a written medium that preserves everything (even something from 2 years ago) and thus has constructed a system that forces its users to hyper-curate their entire public persona or suffer social consequences, and from a practical perspective their own VP failed to curate sufficiently. So regardless of ideals, if the system punishes discussion then I see that as a problem, as well as an irony when it happens to their own VP.


> Perhaps you're suggesting facebook's ethos is actually that a privacy middle-ground can exist, where people can choose what gets shown where and to whom.

They can, to a larger degree than most critics seem to realize. I use those controls all the time. I might agree that they're not as prominent or easy to use as they should be, but they exist because people at Facebook cared enough to implement them (which isn't easy or cheap at that scale BTW).

> I can't disprove that that's their ideal, but in light of their data-sharing I have no reason to believe it.

Oh, you mean the data sharing that was curtailed sharply in 2014, and again in increments ever since? Is that continuing effort and foregone revenue "no reason" to believe such a sentiment exists?

> if the system punishes discussion then I see that as a problem

Yes, punishing discussion is a problem. Wait, what is this entire thread, and a thousand like it all spawned by the publication of these posts, about?


>> Oh, you mean the data sharing that was curtailed sharply in 2014, and again in increments ever since?

Don't pretend they're taking care of it on their own. They're reading text messages in 2018.

>> Yes, punishing discussion is a problem. Wait, what is this entire thread, and a thousand like it all spawned by the publication of these posts, about?

Wait... Are you arguing that it's not bad that facebook stifles controversial opinions on its platform because its behavior creates controversial discussions on other websites...?


> Likewise, you can't simultaneously hold the opinion that users should have control over where their content is seen, and that it's OK to publish and comment on an internal post.

There's a power asymmetry here that makes the individual user vulnerable, and that power asymmetry should be countered by demanding transparency.

There's an obvious parallel here between individual citizens and government apparatus.

Those that control infrastructure and institutions shouldn't be enabled to abuse that power. And if they do they shouldn't be surprised when the affected protest!


>> Likewise, you can't simultaneously hold the opinion that users should have control over where their content is seen, and that it's OK to publish and comment on an internal post.

This argument is illogical, because Facebook forces everyone to sign its ToS to use its services, while nobody forces a Facebook employee to leak internal stuff. Said another way, whether or not I wish to have control over my FB data, FB coerces me to agree that it can do whatever it wants with my data. It's not exactly opt-in, is it? It's far worse, of course, if you consider shadow profiles, because it even coerces people who never explicitly signed up to the ToS. Unless the leak happened via some kind of coercion (which doesn't seem to be the case), your comment is incorrect.

>> In a less spiteful world, some of the employees' reactions might have been taken as evidence that they do understand and care about issues of privacy or containment.

What? You mean you care about something, but you just won't do something about it, nor openly tell anyone why you wouldn't do something about it, or even talk about it before the issue blows up? Yep, totally convincing.

>> Maybe that would lead to more collaboration on solutions,

Why do people need to "collaborate" on solutions? What do they get from it? Is Facebook going to pay people a share of the profits? If Facebook is a corporate entity which serves its self-interest against people's self-interest (which they have clearly been doing for a long time), what kind of idiot would suggest the people whose self-interest has been affected should now "come to the table" so "we can all work something out"?

>> which is necessary because there are actually some tricky tradeoffs here.

The only tricky tradeoff here is: should Mark Zuckerberg be the only one who should go to jail, or should the entire company be rounded up? It is quite tricky, I do agree.

>> But that doesn't give the same dopamine hit as cutting down the tall poppies, right?

I don't know about tall poppies, but "culling" the "weeds" is the only way to have a healthy garden.


> Its not exactly opt-in, is it?

You're free not to use it. If that opt-in isn't enough, exactly how many levels do you want? If you do choose to use a free service, whether it's Facebook or a public library, you have to consider how it's paid for. Actively using something and also actively undermining its means of support ... well, I'll just leave that thought there.

> You mean you care about something, but you just won't do something about it

You seem to have some pretty unrealistic expectations of what individual employees can do at a 30K-person company, or about anyone taking the right action without deliberating first.

> Why do people need to "collaborate" on solutions? What do they get from it?

Ummm ... the solutions, which are not only applicable to Facebook? This is a general problem faced by many companies. The solutions could also be useful to the people who blather about creating a distributed alternative to Facebook. I've been a member of the decentralization and distributed-system community for far longer than Facebook or Y Combinator have existed. I also know something about the scale and connectedness of the data at Facebook. We're multiple basic innovations away from being able to create such an alternative. Wouldn't it be nice if people who actually understand various parts of this can talk and work together? That doesn't become more likely when every discussion is filled with people who only read others' comments enough to find where to insert their own half-baked opinions or insults.


>> If that opt-in isn't enough, exactly how many levels do you want?

Since you can't seem to count to 2, how about:

1. You let us share your data with others in return for free service

2. You don't let us share your data in return for paid service

>> If you do choose to use a free service, whether it's Facebook or a public library

Well, a public library is tax-funded, and people outside the library's staff have a big say in its inner workings. So you can't get your comparisons right either.

>> Actively using something and also actively undermining its means of support ... well, I'll just leave that thought there.

Perhaps you should complete the thought, because I don't actively use the "something" in question.

>> You seem to have some pretty unrealistic expectations of what individual employees can do at a 30K-person company, or about anyone taking the right action without deliberating first.

Really, as opposed to your very realistic expectations that everyone should just trust FB employees would have "done the right thing" had they not been caught red-handed? Oh right, because FB knows better what is best for everyone else.

>>Wouldn't it be nice if people who actually understand various parts of this can talk and work together?

This is truly bizarre. So if FB rolls over and dies tomorrow, does it mean innovation will come to a complete halt? Let us say you think, "oh, but it might take much longer". Does that automatically adversely affect people more than the damages that can be caused to society via rampant data collection? How can you be so sure? Oh wait, because you must be smarter than everyone else, as you got through the interview.

And finally, it is interesting all the things that you selectively left unsaid (exactly like other FB employees have been doing all the while).

- you don't have the courage (what an ironic handle) to discuss shadow profiles

- you never actually addressed the fact that no one from outside coerced the leak, which made your first comment more rhetorical than substantial

- you cleverly twisted the "collaboration" to be amongst FB employees, when the following line clearly shows you actually meant collaboration between FB employees and its users (a dopamine hit for whom, then? So you are now assuming others cannot read either?)


We've banned this account for violating the site guidelines.

https://news.ycombinator.com/newsguidelines.html


Bystander here. Why the ban? It’s snarky in places for sure but I’d say it’s a pretty solid set of points and counter points. It definitely “added something” to my experience reading this thread.


I suggest that you also identify the primary account behind it and give them a reminder too, or else they'll just keep doing it over and over again until their targets run out of patience.


> 2. You don't let us share your data in return for paid service

Personally, I think that might be a good option, but you can't claim to have made it explicit before so your "count to 2" insult is misplaced. I know that the only thing you've ever done since your account was created is bash Facebook (how nice that anyone can check that for themselves BTW), but even in that light such childishness is counterproductive.

> if FB rolls over and dies tomorrow, does it mean innovation will come to a complete halt

Total strawman. Nobody said or implied that. There's plenty of knowledge and innovation everywhere, but the amount that can come from Facebook only has to be non-zero to support my point. Several hundred developers who have collectively worked on almost every distributed system you've ever heard of might have an idea or two worth discussing. They might even have a perspective on scaling issues that's highly relevant to the problem at hand but not widely known outside of Facebook and maybe three other companies. Why do you try so hard to throw cold water on any such conversations?


[flagged]


You were teetering on incivility earlier in the thread and here you fell straight into it. Please don't! Instead, please read https://news.ycombinator.com/newsguidelines.html and follow the rules regardless of how badly anyone else is behaving.


I think we all understand this scenario can lead us to highly charged emotions.

I believe your comment would have been better without the name-calling and the leap to jail time.


“This is so disappointing, wonder if there is a way to hire for integrity. We are probably focusing on the intelligence part and getting smart people here who lack a moral compass and loyalty.”

The sheer lack of self-awareness in that statement.


It's breathtaking isn't it? Facebook sells personal information to advertisers. And this employee worries there might be 'smart people who lack a moral compass' working there? It would be funny if it wasn't also terrifying that Facebook employees believe the company is a force for good in the world. Working for Facebook is about as moral as working for Big Tobacco: there's plenty of evidence the product is actively harmful.


Where can I purchase some Facebook personal information? I'd love to use it for some custom audiences on my ads. /s

As far as I'm aware, Facebook has never sold personal information to advertisers[0], because that would be giving away its crown jewels. It might leak it, but doesn't hand it out for sale.

EDIT: Downvoters, please point out where I'm wrong. It looks bad for the community to suppress factual statements.

0. https://www.facebook.com/help/152637448140583


In every debate about "[Advertising company] sells personal information" I've seen, this is always an issue. I suggest we instead say what it is: They monetize personal information.


Facebook sells ad targeting based on gender, sexual orientation, marital status, age, location, salary, education, musical taste, purchasing behaviour, device usage, interests, hobbies and the list goes on. Without 2B users' worth of personal information, there isn't anything to sell. For Facebook to say 'they don't sell your information' is Orwellian corporate doublespeak. Advertisers pay them money and in exchange, Facebook allows them to target sets of users based on those users' personal information. How is this not 'selling information about you'?
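Mechanically, a buy like that is expressed as a targeting spec that gets matched against profile-derived attributes on Facebook's side. A toy sketch of the idea (field names loosely modeled on the public Marketing API; the values, profile shape, and matcher are illustrative, not Facebook's actual logic):

```python
# Hypothetical targeting spec; field names loosely follow the public
# Marketing API, but the values are made up for illustration.
targeting = {
    "age_min": 25,
    "age_max": 34,
    "geo_locations": {"countries": ["NL"]},
    "interests": [{"id": 0, "name": "Movies"}],
}

def audience_matches(profile, spec):
    """Toy matcher: does a user profile fall inside the targeting spec?"""
    in_age = spec["age_min"] <= profile["age"] <= spec["age_max"]
    in_geo = profile["country"] in spec["geo_locations"]["countries"]
    wanted = {i["name"] for i in spec["interests"]}
    return in_age and in_geo and bool(wanted & set(profile["interests"]))

# An illustrative profile built from the kinds of attributes listed above.
user = {"age": 29, "country": "NL", "interests": ["Movies", "Running"]}
```

The sketch also shows the distinction the thread is arguing over: the advertiser supplies the spec and never receives the profile data; the matching happens entirely inside Facebook.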


Are you confused by the distinction between selling personal information and selling ads? The latter does not involve any exchange of personal information to the purchaser.


It seems you are both correct, and the disagreement lies in the meaning of 'personal':

- They use 'personal' as 'private', or 'information about you'. e.g. I have a fetish, I wish no one to know, it's 'personal'.

- You seem to use 'personal' as 'personally identifiable information'. e.g. if I have this data, I can now trace it back to its originator.

The ambiguity can be found in many places. Most people don't make the distinction; rather, they think anything private is personal. At the same time, the definition you use can be found in official documents like the GDPR, the new EU privacy law (https://gdpr-info.eu/art-4-gdpr/).


Thanks for clarifying this: personal information does not mean personally identifiable information. Personal information can be sold without uniquely identifying individuals and this is exactly what Facebook's core business model is.


So to reword your question:

Facebook allows clients to purchase personal information. What specific personal information is purchased by Facebook advertisers?


It hasn't sold the information, but until 2014 Facebook was effectively giving it away to attract developers to its platform.


I think the CA scandal shows they sell access to gather it yourself, which is a fine distinction that doesn't matter in practice.


CA used data from Facebook's Graph API. They don't sell access to their API, it's freely accessible. It's disadvantageous for Facebook to not keep your personal info in their walled garden of advertising, where they can use it as a competitive advantage for better targeted ads.
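For readers unfamiliar with it, a pre-v2.0 Graph API read was shaped roughly like this (the endpoint path is real, but the token and field list are illustrative, and the friends-data permissions involved were removed in API v2.0):

```python
# Sketch of how a pre-2015 Graph API read was constructed: one consenting
# user's access token was enough to enumerate friends' profile fields.
from urllib.parse import urlencode

def graph_url(node, fields, token):
    """Build a Graph API request URL for the given node and fields."""
    query = urlencode({"fields": ",".join(fields), "access_token": token})
    return f"https://graph.facebook.com/{node}?{query}"

# "USER_TOKEN" and the field list are placeholders for illustration.
url = graph_url("me/friends", ["id", "name", "likes"], "USER_TOKEN")
```

No payment to Facebook is involved anywhere in that flow, which is the commenter's point: the data left through an open developer API, not through a sale.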


You're right. Facebook has not sold personal information to advertisers. That's factually false. What I think we should say instead, is that Facebook sold the notion that 'they knew about you and your preferences' to help advertisers.


I think he means indirectly sell, which is kinda true. By letting the CA scandal happen, they seriously sold a lot of data. Not what they really wanted to do, but their reckless policies and vague moral standards caused that.

Also, nothing changes the fact that Facebook is clearly harmful to society in important ways, according to various studies. And they do not care about this fundamental issue; they just want more users and more profit.


> By letting the CA scandal happen, they seriously sold a lot of data.

The CA scandal did not involve any sale between CA and Facebook. CA sold insights gleaned from data pulled from Facebook's public and free-of-charge Graph API.

> Also, nothing changes the fact that Facebook is clearly harmful to society in important ways, according to various studies.

I'm not sure how that's relevant here. Arguing that Facebook is a bad company does not make misinformation more correct.


It's relevant because we're discussing a Facebook employee who wants to screen hires for integrity. Given the widely known scientific evidence that FB causes anxiety and lowers self-esteem (a direct consequence of its core business model), integrity is already in question when you accept the interview.


I feel like these are weasel words, and I could use the same argument about anything. I could say Clinton let the email scandal happen. Or that Clinton indirectly leaked classified emails, which is kinda true. Or someone can say Clinton is clearly harmful to society in important ways, according to various sources. Clinton just wants more votes and power. I think we should stick with the facts to make a strong argument against Facebook.


Yeah why would FB sell that info when it's the key to their targeted advertising?


Come on, are you serious? 2B people use it every month to keep in touch with their friends. I'm typing this in Dubai, and I use FB/Messenger daily to keep in touch with people in Budapest, London, Amsterdam, Berlin, etc.

Disclaimer: I worked at FB.


The road to hell is paved with good intentions, innit?


The kinds of employee responses I see make me think of an intelligence agency: talk of loyalty and fear of foreign spy infiltration. And quite frankly, Facebook is one. Except none of the employees are "Secret" or "Top Secret" cleared, AFAICT.

I'm speculating here, but it may even be possible that they have a greater global reach than the NSA, as 2.2 billion people are in their system, using their apps on all their devices from watches to desktops. They record all the personal data, track every website being visited, possibly hot-mic phones, and record people's locations. Remember those Russian soldiers posting photos from inside Ukraine? Yeah, Facebook has that data, and intelligence agencies of the Cold War era would have drooled over it.


This is why self regulation doesn't work. People posture endlessly and when given half a chance with any sort of power and real stakes, ethics gives way and we are left with apologism, hand waving, denial and rationalization.

This is easy to fix. Follow the money and change the incentives. And the fix will allow software folks to continue to posture endlessly as they will not have the power to make a decision.

1. Ban micro targeting by advertisers, only contextual text and location can be used.

2. Platforms like Facebook, Google and their chain of contractors, partners etc cannot offer micro targeting.

3. Ban data aggregation with heavy penalties and imprisonment for companies, marketing individuals and their agencies who try this. This will solve these problems in a jiffy.

That removes the incentive for stalking people 24/7, collecting tons of data and making correlation and inferences.

The thing is, so many people, especially governments and political players, crave this power that it probably cannot be done anymore. That's why building surveillance infrastructure is a bad idea: it will always be abused. And that's why the individual ethics of the engineers and of society at large become critical, but we have already failed that test.


I'm surprised by the sheer amount of blind loyalty at Facebook.

I mean, I consider myself to be a loyal employee, but I'm not blind to ethical violations. The way these employees are defending a global multi-billion-dollar organization, it's almost like they were executives. They'd rather sell out the rest of the world for what? To be an FB engineer until they retire? It's like Facebook indoctrinates its employees somehow.

I can't think of a single non-managerial employee at my company that wouldn't speak up if we deliberately started violating agreements with our customers and coming closer and closer to breaking the law, and I'm comfortable with that.


These employees are people who are pulled straight from college, given salaries higher than most of their contemporaries ever dreamed of earning, and are told they are special and are changing the world. Why wouldn't they be blindly loyal to the sociopathic machine they helped create?


The vast majority of these people signed onto what they believed was a sure tradeoff:

They knew FB wasn't the most ethical company around, and that they at times would have to pinch their nose.

In return they would be handsomely financially rewarded.

What we are seeing now is not the response of cult members - it's the response of lots of pragmatic employees/"investors" being protective and worried about their investment. In the worst case they would have irrevocably stained their personal reputations for a relatively small gain.

So to repeat, no - this is not a cult - the employees knew what they were getting themselves into. We should not feel sorry for them.


As a former FB employee, I say "bullshit".

I didn't go to FB thinking "it's unethical, but money." I went there because I wanted to work on a product that I and every one of my friends/family use, and one that helped me a lot when I was getting divorced. Money was OK, but you can make more than at FB, e.g. in finance or other special places.

I've never worked at a company that takes data protection as seriously as FB, or has the caliber of people protecting data.

Also impressive was that _every_ week there was a 1-hour Q&A where any intern/employee could ask Zuck a question (it's open mic). In my time there I've never seen Zuck really duck a question. He stands behind what his company does and believes in the mission, as do most employees.

"it's the response of lots of pragmatic employees/"investors" being protective and worried about their investment" > sorry, this is just silly...


>> I've never worked at a company that takes data protection as seriously as FB, or has the caliber of people protecting data.

I don't doubt this, but I think Facebook has a flawed culture that allows and encourages employees to use this mindset as a rationalization for unethical things like:

- run emotional experiments on news feeds

- silently logging all Android users' calls and texts

- allow proliferation of fake news

- allow buying of propaganda ads from state actors

- zero safeguarding of data to ensure it wasn't sold by app creators/devs.

Data is central to Facebook's business model, and the ability to collect, analyze, and sell lots of data (a natural result of the 'big data' hype) became an infatuation for Facebook employees.

The Boz memo supports my point - except he cleverly hides it as 'grow at all costs' rather than the underlying 'collect/analyze/sell data at all costs'.


"run emotional experiments on news feeds" > I believe this happened once, for the advancement of science. Personally, given that FB is the only place where such real social science can happen, I wish they'd do it much-much more. I don't think any social scientist can perform an A/B test outside of Facebook.

"silently logging all Android users' calls and texts" > I also don't like this.

"allow proliferation of fake news" > I think the "allow" part is disingenuous. It's not like FB people are able to guess all the bad vectors in advance and have advance alerts set up. Also, remember, 2B people are on FB, so there will be a lot of shit, because that's what people are like. I actually think they reacted pretty quickly, after the first time there was a credible attack.

"allow buying of propaganda ads from state actors" > Not sure what you mean? US elections are okay to use ads, right? You're saying other countries shouldn't, if you don't like the gov't? This is a lose-lose on FB I think, either people like you bitch that they're enabling a bad gov't, or they're seen as a censor. Believe me that a lot of smart ppl are trying to figure out what the "least wrong" thing to do is on things like this.

"zero safeguarding of data to ensure it wasn't sold by app creators/devs" > bullshit, they stopped this in 2015. But I agree, the way it worked before was really broken and asked for this to happen.


>> "zero safeguarding of data to ensure it wasn't sold by app creators/devs"

> bullshit, they stopped this in 2015. But I agree, the way it worked before was really broken and asked for this to happen.

> I've never worked at a company that takes data protection as seriously as FB

Before or after 2015? Because a couple of years does not quite make up for the preceding period starting in 2007 (if I recall correctly) during which FB clearly didn't care.


I worked there in 2016-2017. I was a Data Engineer so I was pretty close to this. It was taken very seriously, to the point it was annoying (tables with PII get anonymized, which then means extra work, etc). Also, the sheer amount of effort that went into this [the tooling/infra that was already there for this when I arrived] was impressive.

I'm not claiming FB couldn't have / can't do a better job, you can always do a better job, hire even more people for this, etc. But it was definitely taken very seriously, much more seriously than you'd think from all this bad press. And if you go and work there, you'll be impressed, I guarantee that.

However, what I'm talking about is data protection, the problem here was that app permissions were explicitly too loose [until 2015]. As I said, I also think this was a bad policy, and people are rightly upset. But there's way too much generalization happening in this thread.
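For readers curious what the anonymization step mentioned above typically looks like in a data warehouse, here is a generic sketch (not Facebook's actual tooling; the salt and column names are made up): keyed hashing of PII columns, which preserves joinability across tables while making raw values unrecoverable without the key.

```python
# Generic PII anonymization sketch: replace raw identifiers with a keyed
# hash so joins across tables still line up, but the raw values cannot be
# recovered without the secret key.
import hashlib
import hmac

SALT = b"per-dataset-secret"  # hypothetical; in practice managed and rotated centrally

def anonymize(value):
    """Deterministic keyed hash of a PII value (hex SHA-256 HMAC)."""
    return hmac.new(SALT, value.encode(), hashlib.sha256).hexdigest()

# Illustrative row; only the columns flagged as PII are transformed.
row = {"user_id": "12345", "email": "alice@example.com", "clicks": 7}
PII_COLUMNS = {"user_id", "email"}
safe = {k: anonymize(v) if k in PII_COLUMNS else v for k, v in row.items()}
```

Because the hash is deterministic for a given salt, analysts can still count, group, and join on `user_id` without ever seeing the underlying identifier, which is also why it "means extra work": every pipeline touching those columns has to go through the transform.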


> In my time there I've never seen Zuck really duck a question. He stands behind what his company does and believes in the mission, as do most employees.

Just this week he ducked questioning by the UK Parliament, and opted not to stand behind what his company does. A couple of weeks ago, when this story broke, he opted to just hide for a little while, issue legal threats against the Guardian and the NYT, and see if the whole thing would blow over.


He was very clearly talking about the weekly internal QA. The context of this discussion is what it's like to work at Facebook. Try not to be dense.


That doesn't matter, because it still contradicts the statement that "He stands behind what his company does and believes in the mission."


My understanding is FB is still going, it's just that Zuck is sending somebody else. FB is in a lot of countries, he can't go and personally talk to every parliament for PR purposes. He said he is happy to go to the US one. I'd do the same, go to the US one and send others to the rest.


I believe the problem last time was that whoever was doing the answering (an FB lawyer?) too often claimed not to have the info being asked for ("I'd have to check and get back to you"). The questioners felt it was an intentional ploy to weasel out of answering uncomfortable questions. Hence the current insistence that Zuck be there himself.


There's no guarantee, and in fact I'd suspect it's less likely, that Zuck would be able to answer those questions better than a relevant lawyer or a more relevant lower-level director or VP.


It's kind of culty here, but I don't think that's out of the ordinary for silicon valley. There is a lot of internal debate and discussion about things, as well. It's not a hive mind.


From the original post by "Boz"[0]:

> Maybe it costs a life by exposing someone to bullies. Maybe someone dies in a terrorist attack coordinated on our tools. And we still connect people

This guy is a VP at Facebook. Words mean things and his words have weight within the company. This alone disgusts me. He could have easily taken the other side of the argument to stir debate and chose not to.

> Leakers, please resign instead of sabotaging the company

I think the level of hubris espoused by these Facebook employees is a much better reason to delete Facebook than anything I've seen so far. In fact, although the data we have is incomplete, this may well be the general feeling. The focus on growth and profit over any thought of doing the right thing is actually evil, especially when one recognizes that evil is being done.

This company is no longer a small company built out of a dorm room. It is a massive publicly traded company that has revenues and active users in the billions. Despite the current climate, words actually have meaning, especially words greatly amplified on these tools, and these actions have real consequences.

[0] https://www.buzzfeed.com/ryanmac/growth-at-any-cost-top-face...


The reason Facebookers hate leakers so much is because it helps no one. The leaker gets to continue working his cushy software job while a bunch of PR people are forced to work overtime to control the situation. Facebook is going to continue to do whatever it wants to do, but they'll just be more secretive about it within the company. Boz could have easily written a memo with corporate-speak if he just wanted everyone to drink the Kool-Aid.


> The reason Facebookers hate leakers so much is because it helps no one.

It helps the general public by giving insight into the mindset of the people who build and moderate a platform used by millions to communicate.


How are they crazy? If anything, I think it's refreshing that a VP acknowledges Facebook's problems. If he never acknowledged it, everything would be peachy. I guess ignorance is bliss for most people.


It helps me understand what is really going on behind closed doors. I think you meant "it helps no one who works at Facebook", a demographic that most of us don't care about at all.


It sometimes helps the public, against whom the conspiracy is performed.

Do you think Snowden's leaks were intended to help the NSA?


He does seem like a bit of a bully himself:

http://boz.com/articles/be-kind.html


Not cool. Your comment descends a bit into ad hominem territory.

That blog post was basically Boz publicly acknowledging a personal flaw he hadn't been aware of up until his interaction with Dustin. In other words, calling him a mean co-worker in 2008 would be an accurate characterization, if we rely on his recollection of events when Dustin was still Facebook CTO. Calling him a mean co-worker now would be an unfair characterization.

FWIW, Jeff Bezos told a similar story of how he became aware of his meanness to his grandmother in a commencement address in 2010, though in his own case he was a 10 yo boy when he came to that realization. [0].

[0] https://www.princeton.edu/news/2010/05/30/2010-baccalaureate...


From the outside, this all seems rather cultish. Are Facebook employees so convinced of the nobility of the company, in spite of all the evidence to the contrary?

It brings to mind the famous Upton Sinclair quote: "It is difficult to get a man to understand something, when his salary depends upon his not understanding it!"


> Are Facebook employees so convinced of the nobility of the company, in spite of all the evidence to the contrary

At best, we only have evidence that the subset of employees who chose to comment on that internal thread and whose comment was chosen by the journalist to be included in the article feel this way.

One of the things that media and aggregation have really accelerated is the cognitive fallacy where we assume the most interesting data points are the most frequent when in reality the inverse is almost always true — common is boring. If you were to go by the news, men bite dogs way more often than dogs bite men. But that's only because "man bites dog" is worth reporting and "dog bites man" is not.


Or maybe they're just all pretending to, because they're all saying this in the company facebook discussion, knowing it's trackable back to them, so they're trying to prove loyalty.

It's a disconcerting system.


I thought the point of "culture fit" was to make sure one only hires employees who truly believe in the company and its ~~religion~~mission.


> in spite of all the evidence to the contrary?

Evidence to the contrary? What is your evidence that the value of Facebook is outweighed by its costs?


What material is "the bubble" made out of if Facebook employees immediately jump to "spies" when it comes to leaking something like this? Has it been so long since they've had contact with soul-containing humans that they forgot how they operated?! Zucker-bot must be replicating ye olde FB HQ.


Well, selection bias aside, consider that Facebook (along with the other Big Tech firms) has for years engaged in a hiring process that specifically targets fresh young graduates, who are told they absolutely have to be the best of the best to get through the interview process. "Cult" is the word that comes to mind when I think of the hiring practices and cultures at these companies. It shouldn't be surprising when a substantial number of them have this attitude.


I don't think it's fair to say the entire company believes it was spies. Clearly a lot think it was just a few jerks. That said, FB is almost certainly infiltrated by a few governments. It's not exactly a small company; if China wanted to get someone on the inside, it couldn't be that hard.

But IMO, this was just an employee who admires Snowden.


> just a few jerks.

Where by "jerks" you mean "people with deep misgivings and the courage to risk their careers by speaking out against one of the world's premier surveillance organizations?" This isn't the kind of thing you do for lulz, and I doubt they're getting paid.


I think what sidlls said is certainly true, but it's important to remember that media organizations like to focus on things that would be interesting to their readers.

Yes, this cultish behavior is real, but I'm sure there are just as many jaded employees or else this leak would never have happened.


Undiluted self-righteousness. Told they're the best and most worthy by their schools, thence to an employer where they tell each other they're the most virtuous and important. There is a yawning gulf between how they see themselves and how they appear to the rest of the world.


Funny how they think spies would be more interested in leaking internal memos than compromising their orwellian treasure trove of data.


Frankly, it sounds like a cult.


It's interesting to see how Facebook reacts to internal corporate data being shared, in reverse to Facebook sharing private user data to third parties.

Boz complains that his memo was taken out of context and he doesn't even agree with it anymore, yet everyone is judging! Gasp! Facebook on the other hand totally connects people by creating and selling ad profiles on said connected people. Based on data people shared years ago, out of the context, disagreeing to what they said years ago! And that's a good thing, right? Because it connects people! Ugh.


I've got an idea. Is there any way to place targeted advertisement inside Facebook's internal communications feed?

Based on Facebook's latest reactions to media coverage and memos shared to public I was able to deduce an ad profile which I'd like to sell. Facebook, it appears, might start looking for external psychological and legal counselling, and I might have third parties interested in targeting that circumstance.


"Hi, it looks like you're trying to type an internal memo, and the language you're using suggests depression, angst, and anxiety! How can I help?" --Facebooky


> Andrew “Boz” Bosworth, a vice president at Facebook

~sigh~

"Boz" is the only person I regret introducing to programming. (I convinced him to attend the 4-H project where I taught BASIC)


Do you still teach? I ask because when I obtained my C.S. degree, the curriculum required that students take a course that focused on ethics, entitled: Computing, Society, & Professionalism [1.]. I think this would be hard to squeeze into a BASIC course, but discussing the implications of use/abuse of technology is valuable.

[1.] https://www.cc.gatech.edu/fac/Amy.Bruckman/teaching/4001/fal...


Not many students see their ethics course as anything more than a writing credit, with some resentment.


I agree. It feels forced. But exposure is important.


Would love to hear why you have such regrets?


There may be a clue in the story we’re commenting on.


Wow, according to this, leadership was fully aware that collecting cell phone contact data is unethical, but they did it anyway because the end justifies the means. Scary stuff. I always thought it would be awesome to work there, with the free food and games and stuff, but I guess it all comes with a cost. Facebook internally seems like a pretty unhealthy community.


The leaker simply "connected" Facebook to the outside world.


Here the employees thought they were just sharing among "friends" and someone went and used their information for other purposes.


Yes! Well said.


Seems like this is a bit of a dupe, so I'll repeat my initial comment:

There are many, including those inside Facebook, that are actively asking for more federal regulations. I don't do regulation discussions. Brings out too much stupid.

But I will say that this is a survival ploy. If you get regulated, you are now approved by the government. You're good to go, able to operate freely in society. You just have to abide by whatever the regulations are.

Facebook may have a business model that just doesn't work in a society that values privacy, anonymity, and small diverse groups of people with widely-varying mores. This seems to be the lesson Facebook itself is learning now about, well, Facebook. What do we do then? It's not like anybody at Facebook is going to bring that up. They've got a ton of money. What do we do if the existence of Facebook itself is unacceptable?


> If you get regulated, you are now approved by the government.

Not necessarily. I suppose it would be possible to regulate it into nonexistence, by undermining the privacy-invading business model. You could probably get at least part-way there by requiring that profile-based targeted advertising and data collection be explicitly opt-in with informed consent. IIRC, the GDPR has a lot of good stuff about how consent must be obtained. Further regulations could mandate that services and incentives cannot be provided conditionally based on tracking or accepting profile-based targeted advertising, etc.

If the above regulation takes effect, much of the business value of profile data evaporates. Facebook would only be left with eyeballs to shove non-targeted ads in front of, and maybe generalized market research.


“That’s why all the work we do in growth is justified. All the questionable contact importing practices. All the subtle language that helps people stay searchable by friends. All of the work we do to bring more communication in. The work we will likely have to do in China some day. All of it.”

If you justify "questionable contact importing practices" then you aren't putting up a straw man. You are talking honestly about your "questionable" decisions and trying to defend them. Thus you should expect outrage from those who realize you've lied to the public using "subtle language" in order to grow at any costs.

In other words, the self-awareness to call out the "subtle language" and "questionable practices" implies that the company pursued those growth strategies despite CLEARLY knowing they were sketchy. This is extremely damning but those in power will try to deflect the problem as if someone leaking it is the issue.

I never would have worked at Facebook before due to their questionable policies. This incident reinforces that and shows that the problems come from the top. Now we'll watch debate there get muzzled in the interest of staying out of the news and growing at all costs. Perhaps a success for shareholders, but a failure for rank and file employees with morals.


The work we will likely have to do in China some day

Social media disruption disintegrated Libyan society to the point it could no longer function. The Arab Spring was a test firing of a psychological nuke. We should all be concerned that Facebook is planning to attack China, because they will defend themselves.


I didn't interpret that as Facebook planning to attack China. To me, it sounded like the work they'd have to do to cooperate with the Chinese system of censorship and surveillance in order to get their product accepted there. (E.g., build a version of PRISM[1] for the Chinese government, just as they did for the NSA.)

This kind of collaboration with an authoritarian regime is something that Facebook employees who value freedom might find to be distasteful, but if it furthers the company's lofty goal of "connecting everyone in the world", then, hey, it can be justified.

[1] https://en.wikipedia.org/wiki/PRISM_(surveillance_program)


I'm kind of amazed by the attention FB is receiving over the data leak and their general attitude toward privacy. They have been incredibly clear that they don't value privacy at all; Zuck even said so years ago.

- They have a huge spam-bot problem that they willfully ignore
- Use aggressive email and notification techniques to re-engage users
- Stonewall traditional publishers to accommodate their distribution techniques
- Sell and trade personal user information for advertising
- Have most security and privacy settings buried under submenus, almost as if intentionally built to be ignored
- Have a history of shady apps and games on their platform with questionable value to the user
- Have a culture of growth at all costs, without regard to the substance of that growth

All of the above was known for years, has nothing to do with the recent breach or 2016's election. I just can't understand why this is a surprise or why we should trust a word FB says.


Zuck even said so years ago.

Zuck also got a free pass on his “young people are smarter” comments. I hope the rampant ageism catches up with FB too.


Exactly. Sure, he put out a press release and some PR moves "expressing regret" over his "word choice", but does he really regret it or care? It's impossible to tell when he is sincere, impossible to trust him.


Cue Facebook with the fake victim complex.

I respect the level of propaganda skill with which they are trying to shape the narrative into one where they have been victimized, are fully justified, and are not actually at fault.

What impresses me most is how homogeneously, fanatically unified their culture is.

It’s said a cult is that which can only survive by cutting itself off from reality.

What scares me is that people who are so fanatical about their mission and corporation are responsible for so many real human outcomes... "The best lack all conviction, while the worst are full of passionate intensity."


Facebook employees are perfectly suited to work in spy organisations such as the NSA. They all seem to agree on a "punish the leaker" attitude, instead of self-reform!


Well, that was unfortunate for Facebook. Zuck was trying to mitigate the PR disaster that Cambridge Analytica produced last week, saying that mistakes were made. Now there is another whistleblower who basically destroys the credibility of anything Facebook will have to say in the upcoming weeks. It appears that Zuck's lieutenants have a distinct understanding of what's good for them versus what they think is good for others. Either the head does not know what the hands are doing, or they are all lying and all of this is common practice in the entire user-data-fencing industry.


The tone of the responses is objectively creepy. Almost cultish. It makes you wonder if FB has become known as an amazing place to work because their HR org has become so good at selling the FB culture to anyone who joins, to keep them in line.

Now that mirage has a few cracks in it and executives are freaking out.

Reminds me of Westworld.


You can see the same approach at Google. Leaks are characterized as betrayal of the Google family/social group. The goal of this seems to be to ideally prevent people from leaking things they can't be fired for leaking.

If you leak a trade secret, you can be fired. But if you talk about work practices, that's a legally protected right that these companies don't want you to exercise. If they can make people who do feel ostracized, like they're betraying their family, maybe they'll reconsider sharing what goes on behind the curtain.


There are three kinds of leaks. Those that leak upcoming products or company roadmaps, those that leak internal company reorgs, meetings, mailing lists/announcements, and those that leak gripes.

I assume you're ok with people being mad about leaked products they've been working hard on for years having their big debut ruined?

So why would employees be mad about the leaking of internal meetings or discussions? Because in the majority of the cases, these leaks don't hurt company management, they hurt employees.

Several times in the past, Googlers have had their physical or mental safety threatened by such leaks. What you see as a leaked company discussion turns out to be a real-life doxxing of employees. Would Damore have been fired if someone hadn't leaked his memo externally? Probably not; he'd have been quietly transferred by HR and maybe managed out, but able to preserve his career elsewhere in the valley. The response to his firing was that the internal company meeting questions got leaked, and the real names of several female and transgender employees got forwarded to alt-right sites, which proceeded to harass them, send death threats, etc.

And one year, an "innocent" leak of Google's yearly company Christmas present, a brown envelope with cash in it, put the physical safety of Googlers at risk, because Google Shuttle bus stops are well known and Googlers are easily identified in public, and that day, everyone knew Googlers would be carrying hundreds of dollars of cash in brown envelopes.

These leaks often harm employees, not management.


Allegedly, Google's confidentiality demands go well beyond not disclosing product launches, as claimed in a lawsuit that, to my knowledge, has not yet concluded: https://www.scribd.com/document/334736972/John-Doe-vs-Google...

Furthermore, you can see in this letter from Brian Katz, the head of stopleaks, more or less everything discussed here, and certainly, it coming from "management": https://regmedia.co.uk/2017/05/22/katz-letter-google.pdf (Actually, everything here is useful/interesting: https://regmedia.co.uk/2017/05/22/discovery.pdf )

Note that he expressly suggests bringing concerns to a manager rather than speaking publicly about them, which likely includes concerns about working conditions or illegal conduct. He also cites the whole "damages our culture" aspect that is extremely relevant to the topic at hand.

Edit: Removed off-topic addressing of removed content from parent post.


Google employees who leak official confidential information (unannounced products, internal company documents, etc.) can get fired. There's nothing shocking about that; it will get you fired at pretty much any workplace. These are not "whistleblower" leaks.

If you leak say, a private thread from internal mailing lists, you could lose your job, especially if it could harm other employees. If you summarize some issue, you probably are on steadier ground.

Google has a vibrant internal culture of criticism. Memegen has already leaked (https://gizmodo.com/5946769/google-workers-make-internal-mem...), execs and products get raked over the coals at weekly company wide meetings, there's actually internal comics that satirize company beliefs.

And yes, management doesn't always listen, probably the single biggest reason unionization might be needed in Silicon Valley, not for worker pay and benefits, which are already good, but to ensure management listens on other issues.

BTW, the TGIF transcript and meme leaks are the kinds that have ended up Doxxing employees and triggering harassment. The gizmodo link I put properly redacts the usernames, but other leaks have not.


The "doxxing employees" thing only carries so much weight with me, because I feel that your correspondence in a corporate environment which impacts billions of people should have accountability, and the corporate party cannot be entrusted with it.

People should not say things even in a private corporate setting that they would not be comfortable if was someday public. Not only can a lawsuit or investigation bring internal correspondence to light, but for instance, some of my correspondence could be FOIA'd! People fail to realize their corporate correspondence is not their private place, and treat it accordingly.

I do have a lot of sympathy for targeted folks like LGBT folks getting harassed, particularly after events like Damore and the related lawsuits and embedded posts/content. But in many cases, they are already publicly identifying their RL name, gender identity, and political positions on Twitter, for example. They weren't so much "doxxed", as they became higher profile due to having been in the news.

This sucks a lot, but we can't hide important information from the public based on the unfortunate reality that soul-sucking nightmare trolls chase every name that gets even the briefest of public attention... imagine if we took that approach/mindset with leaks coming out of the White House.

PS - I really wish more of Memegen was leaked/public. I can't count how many times a news story has led to the thought "darn, I wish I could see what Memegen looks like today".


Other employees don't like leakers because they're selfish and immature. The leaker gets to continue working their cushy software job while forcing PR people to work overtime. "Leaks" like this don't help anyone besides journalists anyway. The most obvious example is the James Damore diversity "manifesto." I can't conceive of how that leak was supposed to accomplish anything but generate controversy.


> while forcing PR people to work overtime

As opposed to doing what else? Emptying bins?

They are employed to spin news for the benefit of the corporation. They have risk assessments and plans of action for the most likely eventualities. Their job is to do exactly what they've been doing in response to this leak.


A janitor is employed to clean. That doesn't mean you should throw your garbage on the floor.


Leaking Damore's memo worked exactly as the leaker almost certainly intended, as Cromwell points out below: It got him fired, which is what many Googlers were demanding internally before it became public.


And do you think that Googlers shouldn't be mad at the leaker?


I definitely know of both Googlers happy it leaked and Googlers upset it leaked. At some 70,000 employees, "Googlers" isn't a homogeneous group.

Google, as a corporate entity, definitely did not gain from the leak, and certainly would've preferred it not leak.


You implied that the attitude towards discouraging leaks is cultish, but it seems rational to me.


The Guardian has a great piece on this, https://www.theguardian.com/technology/2018/mar/16/silicon-v...

Below is an excerpt

———

The public image of Silicon Valley’s tech giants is all colourful bicycles, ping-pong tables, beanbags and free food, but behind the cartoonish facade is a ruthless code of secrecy. They rely on a combination of Kool-Aid, digital and physical surveillance, legal threats and restricted stock units to prevent and detect intellectual property theft and other criminal activity. However, those same tools are also used to catch employees and contractors who talk publicly, even if it’s about their working conditions, misconduct or cultural challenges within the company.


I wonder about that. I work as a consultant, and from what I've been able to tell at interviews, surveillance at a place like Google or Facebook is minimal.

McDonald's, and retail businesses more generally (Target, Walgreens, ...), call centres... those are the ones both installing massive surveillance and actually getting it analyzed. Or at least, they pay me loads of money to make it possible, and from the conversations I've had, they do this routinely. And frankly, I'm pretty sure that since this is directed by low-level managers (you can't do it otherwise, not at that scale; all tracking is done by the store manager), there will be tons and tons of abuse of these systems. I mean for trivial reasons: stealing, attempting to make a false accusation of stealing stick, stalking women (or even men), that sort of thing.

From what I can tell, at FB the entrances and exits, some presumably high-value locations inside, and to a lesser extent the parking lot are the places under surveillance. I'm pretty sure that in most places in the FB buildings, nobody can see you on any security monitor. It's one of those open-plan offices with not much privacy, so it would be easy to get everything under surveillance, but they're not doing it. I've been to several call centres where surveillance was utterly pervasive in the same sort of environment (although the environment was much, much better at FB: a big open office plan, but the air was perfect, for lack of a better word; it wasn't cramped at all, no wall smelled of smoke or oil, nothing like that). And of course you hear stories, like how "sneaking" into the office on Sunday for a board game, because it's an easy and central place for everyone to get together, is pretty normal.

Of course I realize that this is "fake" like it is fake at any company. Unless you're the CEO AND own a majority of the shares, the company is not a social club, it is not there to have your back. But they have that vibe going there, and they wouldn't have that vibe if they broke it for trivial reasons.

> legal threats and restricted stock units to prevent and detect intellectual property theft

Protecting yourself against insider threats by giving employees the means for a good life! How UNAMERICAN!

I'm pretty sure at your local supermarket you'd find those same threats, except they're totally pervasive. Every employee will have been threatened, and I assure you, ...

... not with restricted stock units or any kind of reward.


Do you need to surveil employees with cameras when you can monitor everything they do on their computers and track where their phone is in the building?

http://www.businessinsider.com/facebook-employee-concerned-c...


Judging by most call centres I've consulted for... yes.


> Protecting yourself against insider threats by giving employees means for a good life ! How UNAMERICAN !

Discussing working conditions publicly is legally protected by worker rights laws.

So yes, protecting yourself against that (by firing people for discussing working conditions, for example) is both un-American AND illegal.


There's also selection bias at work here though - the only responses that'll get media attention are the ones that are novel or 'newsworthy' - we aren't seeing the normal boring comments.


It's unusual for a company that big to have so few leaks in its 14 years of existence, especially one that is this high profile.

My understanding is that FB does not have a "fear-based" culture that would've prevented leaks, so really the only way they could keep people in line at scale is if there was a cultish element to their onboarding process that makes people "love" FB so much that employees are actually so offended by a leaker to make comments like this.


But it does have a fear-based culture when it comes to leaks.

https://www.cnbc.com/2018/03/16/facebooks-secret-police-catc...


Fairly capable private investigative and security forces are pretty common for tech companies now. I am not surprised Facebook falls into this category.

Google's is run by a guy named Brian Katz, who's been the subject of a lawsuit before: https://arstechnica.com/tech-policy/2016/12/anonymous-google... and the very same has allegedly threatened a bartender who found a prototype Nexus 4: http://www.dailymail.co.uk/news/article-2224589/Google-threa...

Apple security folks have tossed a guy's home looking for a prototype iPhone, escorted by police. All of them, allegedly, had badges: https://gizmodo.com/5836990/lost-iphone-5-investigators-were...

They obviously can intimidate employees into silence, but it's far more useful and beneficial for morale to go for the "loyalty" angle, and make sure employees shame others who leak.


You'd be surprised by the number of "job opportunities" at multinational corporations where Mike Ehrmantraut skills are required (jobs subcontracted to subcontractors).


Identity and cognitive dissonance.

I would imagine people at FB that believe working there is objectively good are having a hard time reconciling that belief with the negative externalities that are playing out in society at large.


Everyone puts them (and other Big N) on such a pedestal that when they finally land the job they feel like the company is a reflection of them as a person.

If you see working at one of these companies as a status symbol, you'll do whatever it takes to protect that image.


Agreed, but I would point out that if I were an FB employee and I disagreed with the memo, I certainly wouldn't be saying "yeah, we suck" on the highly-traceable internal discussion board! I'd say that stuff face-to-face with people I trusted at the water cooler or something.

There's a strong selection bias at work here in these posts and I think it's a serious statistical error to assume this represents the median view of Facebook employees.


Yea, they sell culture and give away free food. As you can see[0], most devs at FB are eating, traveling, eating, going to cafes or eating some more. ;-)

[0] https://www.youtube.com/watch?v=hWFDujYzvbI


That is some real interesting insight derived from a gossip column about a few facebook posts.

