Facebook and the Normalization of Deviance (newyorker.com)
89 points by mitchbob 7 days ago | 54 comments





I was just thinking the other day about what it must be like to work for Facebook on their security team (I'm in security myself). On an infosec team I have an idea of who the attackers are - there may be multiple, ranging from disgruntled employees to nation states. I think about their incentives, capabilities, and how to reduce risk.

At Facebook... I'm working for the attacker. You're protecting users from some set of other attackers, of course, but like, if the idea here is "reduce harm to the user", you're working for one of those attackers.

Facebook is essentially normalizing a breach. It's not really particularly different from some threat actor gaining access to a database, except that the users "consent" (if such a thing is possible - as if users understand the implications of their consent, not to mention the tracking that happens without consent).

As the post notes, and as Facebook states, "Longer term, though, we expect more scraping incidents and think it’s important to . . . normalize the fact that this activity happens regularly."

Of course, this makes it clear that protecting users isn't the goal; protecting the company's interests is. It feels like engineers and others may be in need of a Hippocratic oath - a commitment that makes it clear that, while you may be employed by some entity, you must do no harm. You have ethical responsibilities that extend beyond the scope of your employment to that entity.

> Neither will I administer a poison to anybody when asked to do so, nor will I suggest such a course.


Are we the baddies?

You could say the same of IT in insurance or banking though.

Brave post! Seems whenever you imply on HN that developers should follow some code of ethics, and be held responsible for the ethical impact of the systems they make, people look at you like you have a horn growing out of your head. “But what if my manager TOLD me to write that evil code??” “Hey, I’m not making the requirements, I just implement them!” According to a lot of replies you will get, we are just helpless developer drones, with no say in what we are building.

"Please don't sneer, including at the rest of the community."

https://news.ycombinator.com/newsguidelines.html

Also, please do not post flamewar comments. It's not what this site is for.


> Brave post!

Not commenting on op’s point, but let’s be honest - saying that Facebook is evil on HN is as brave as saying that Fox News is evil on CNN.


I think they mean less of the "Facebook is evil" and more of the idea of any kind of regulation or personal responsibility/liability with regards to tech workers.

Teams must be responsible, i.e. when harm is done, they know how to remedy it fast. Company culture needs to favor this beforehand: accept the added costs in order to reduce those risks and damages. It's part of "fix your own shit, don't export it to hostages."

Accountability will need to be with the people calling the shots.

There is a lesson in this for every level.


[flagged]


I think Zuckerberg is a good person. Heck, I think most people are good people who simply disagree with me about what is right, or what trade-offs to make. Going about thinking your enemies are evil is no way to run a polite society. After all, evil people ought to be jailed, and jailing your political opponents is a poor way of winning an argument.

No you don't. There is no algorithm from any philosophy in history that evaluates his behavior to "good." Besides that, what you "think," or say you think, is irrelevant. People lie and you're a people, so you saying you think something doesn't mean you actually do, or, if you do, that it's an evidence-based thought. Perhaps you were raised to think "all people are good deep down," but that's a ridiculous and dangerous assumption. Naive at best.

Regardless, if you think he's good then you aren't.

You've provided exactly no evidence that he's good and there's an abundance of evidence that he's bad, which I provided and is readily accessible with even rudimentary investigation.

If you intentionally cause suffering to benefit yourself, you're a bad person. Period.


I appreciate the unsubstantiated downvotes. It confirms my point of view.

I don't disagree with your point, but what the hell - "lecherous, creepy, misogynist purposes"?? Zuckerberg built Facebook to propagate his goal of harming women? What?

See https://www.thecrimson.com/article/2003/11/19/facemash-creat...

Zuckerberg initially made Facebook so he could creep on the girls at Harvard.


I honestly don't see anything creepy about that. Let alone "misogynistic".

edit: Oh, gathered without explicit consent. Well that's dumb of him. Still, they're public photos...

edit: Okay, the article says "from the facebooks", and "hacked into", but that might mean whatever, so I'm willing to tentatively grant "creepy" on the basis of lack of consent. I still think misogynistic is a stretch.

At any rate, I don't think this makes Facebook misogynistic even granting that this website concept is. Really, what this says to me is mostly "Mark Zuckerberg has no understanding of privacy or private space." Which, um. Yes.

edit: To clarify, the reason I think it's a bad take to call this creepy and misogynistic, and that I'm annoyed enough to raise the issue, is that it's fairly well-established that men tend to rate women in their heads, because they're more visually focused on average. So this just seems like another instance of taking a totally normal, widespread male behavior, adding a social disability on top, and calling the result creepy - a creep and misogynist in other words being a person who does "being a man" badly, like the uncanny valley of gender performance. I'm not defending Zuckerberg in general here, or even the thing he did in itself, I just find the framing problematic.


Equating actual, illegal signature fraud with Facebook's facilitation of free speech - even speech that you, personally, disagree with - is a far bigger problem than vaccine misinformation.

Non sequitur, straw man, false analogy. What you're saying here is totally irrelevant to anything I said. Legality and morality are orthogonal. Violating people's privacy and messing with their dopamine receptors to manipulate them into viewing ads because dollars are more important to him than people's mental well-being is not free speech.

There are some real crazy quotes in that piece.

>"According to the newspaper, Zhang was fired for “spending too much time focused on uprooting civic fake engagement and not enough time on the priorities outlined by management.”"[...]

>"Facebook’s products enabled corrupt governments to create fake followers and fake “likes,” which then triggered Facebook’s algorithms to boost their propaganda and legitimacy. According to the Guardian, when Zhang alerted >higher-ups about how this was being used by the government of Honduras, an executive told her, “I don’t think Honduras is big on people’s minds here.”

Just ponder that attitude for a second and ask yourself if it can truly be said that Facebook is still in any way shape or form aligned with human flourishing.

And more importantly small countries should take note. Their communication infrastructure is controlled by people who treat them like pocket change.


If he was tasked with a job regarding domestic (US) activities, why should he be looking into Honduras?

>ask yourself if it can truly be said that Facebook is still in any way shape or form aligned with human flourishing.

FB contributes a lot to many open source projects such as Godot and OCP. Please stop being so dramatic


Just FYI, something like 15% of Hondurans live in the US [1], and the share is even higher for Salvadorans; not unrelated, the murder rate down there is extremely dramatic.

[1] https://en.wikipedia.org/wiki/Honduran_diaspora


If you get some open source breadcrumbs, then you're fine with attacks on civilians and ethnic cleansing? (Facebook, Erdogan (in Turkey), Myanmar)

"being so dramatic" you said about things like that?


Note for headline-only readers: the “normalization of deviance” here refers to organizational issues, referencing the term as it was coined in regards to the events leading up to the Challenger incident. It is not, as far as I can tell, intended to be a stigmatizing remark about behavior considered “deviant.” It’s probably worth noting this since it’s easy to interpret it that way from the headline.

Problem solving in NASA is a very different story compared to problem solving at Facebook.

NASA hires rocket scientists to build rockets. Enough talent existed in the military working on that stuff, even before anyone started daydreaming of flying to space.

Facebook on the other hand hires computer scientists, programmers and an assortment of random over-energetic mindlessly ambitious buffoons (too ignorant to fully comprehend how ignorant they are outside their domain) to work on Sociology, Political Science and Social Psych problems.

What solutions can anyone expect?

Many of the problems thus created are barely studied by domain experts at such scales.

So expect solutions to always be half-baked and to create their own new issues for a long time before any real breakthroughs happen. It's like waiting for meteorologists to announce they have finally worked out how to control the path of hurricanes.

Of course, the difference here is that meteorologists don't create and inject chaos into society. Facebook does, and is well rewarded for it.

It's high time they were punished.


This is the "move fast and break things", "grow at all costs", "ask forgiveness rather than permission" culture. Facebook is the most obvious and visible example of this, but I fear it's permeated far beyond that in our industry.

It's "walk over dead bodies"

Still don't think Facebook did anything wrong. What's the difference between their recommendation algorithm and Netflix's? If people want things you don't like, then your beef is with them, not Facebook.

Web search for "Facebook Myanmar genocide" and "Facebook WhatsApp lynchings" -- Netflix doesn't make such things happen.


I'm not complaining, but I'd like to point out that I think this violates HN's terms of service: "You agree to not use the Site to:

email or otherwise upload any content that (i) infringes any intellectual property or other proprietary rights of any party;"

It seems like sharing a link to copyright infringing content (archive.is) could be considered Contributory copyright infringement (https://en.wikipedia.org/wiki/Contributory_copyright_infring...)

By analogy, how is this so different from posting a link to a pirated movie originally from Netflix? Because it's text?

I think we have a case of normalization of deviance right here on HN. Illegal or stolen content is posted in the comments here all the time.


Not sure what you mean here. Opening the original link in an incognito window yields the full text of the article (maybe you are out of free articles, and they are counting?). Submitting articles to archive.is or archive.org serves the important function of maintaining an accurate historical record and preventing link rot.

Rules were made to be broken. In this case, dang themselves has commented on the matter[1]. Complaining about paywalls is off topic and discouraged, and linking archives is tolerated.

I disagree with your sentiment vehemently. Firstly, law is not morality; people break the law all the time, every day, at their own discretion. It was made by humans and is not infallible. Secondly, I can’t just do a few GET requests to Netflix and get the full movie, yet this archive.is scraper has no trouble. So I guess it wasn’t very protected to begin with.

The internet giveth, and the internet taketh away. They can get their better protection by not serving full articles publicly. They presumably do not because they would like to eat their cake and have it too with regards to indexing.

This is hugely off topic so apologies but there is a reasonable chance I am not replying beyond this point.

[1]: https://news.ycombinator.com/item?id=10178989
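
To make the GET-request point above concrete, here is a minimal sketch, assuming a "soft" (client-side) paywall, of why the article text is effectively public. The URL is a placeholder, and whether any particular site behaves this way is an assumption, not a claim about The New Yorker specifically.

    import urllib.request

    # Placeholder URL -- not a real article. Whether a given site ships the
    # full text in the initial HTML is an assumption about soft paywalls.
    URL = "https://example.com/some-article"

    req = urllib.request.Request(URL, headers={"User-Agent": "Mozilla/5.0"})
    with urllib.request.urlopen(req) as resp:
        html = resp.read().decode("utf-8", errors="replace")

    # With a client-side paywall, the body text is usually already in this
    # HTML; the blocking overlay is only added later by JavaScript.
    print(f"fetched {len(html)} bytes of HTML")

If the text is in that first response, "better protection" really does come down to not serving the full article publicly, as the comment above says.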


Complaining about FB without discussing a viable alternative, or even what we would have to change to enable a potential future viable alternative, isn't super helpful.

I think the real problem is that a small set of companies are exercising control over what people see and read.

Facebook's censorship (which includes algorithmic timelines, prioritizing some things and censoring others) is an existential danger to the wellbeing of our society. Any organization that can decide what links and text can be shared from neighbor to neighbor or from friend to friend is in a position of massive power.

They even prohibit links to certain tweets in DMs.


Aren't their algorithms all designed to optimize for engagement? Isn't engagement just the measure of how much time you spend on and interacting with their system? Doesn't that mean that they're not deciding what to keep feeding us, but rather that we are?

It seems like what they've done is basically automate, "Give the people what they want.", as it were. We produce the clickbait, we click on the clickbait, then we get fed more and more of what we click on.

I tend to think that this probably boils down to rent-seeking via exploiting self-destructive human behavior/biology, so I'm not too fond of it for those reasons. But, that doesn't seem very much like Facebook censoring things as it does them feeding everyone exactly what people indicate to them with their clicks and comments they'd like to be fed.

I suppose there's an interesting discussion about what lengths companies like Facebook should go to save people from themselves, but to me they're basically another form of vice monetization like cigarettes, alcohol, etc.
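
To make that feedback loop concrete, here is a toy sketch (purely illustrative: the item names, click probabilities, and scoring rule are all made up, not Facebook's actual ranking) of "we click on the clickbait, then we get fed more of what we click on":

    import random

    # made-up click probabilities for three kinds of post
    click_prob = {"birth announcement": 0.30, "outrage clickbait": 0.60, "nuanced essay": 0.10}
    score = {item: 1.0 for item in click_prob}

    def pick(score, explore=0.1):
        # mostly show the highest-scored item, occasionally try something else
        if random.random() < explore:
            return random.choice(list(score))
        return max(score, key=score.get)

    for _ in range(10_000):
        shown = pick(score)
        if random.random() < click_prob[shown]:
            score[shown] += 1.0      # a click feeds straight back into the ranking
        else:
            score[shown] *= 0.999    # gentle decay when ignored

    print(sorted(score.items(), key=lambda kv: -kv[1]))
    # the feed ends up dominated by whatever gets clicked most, with no notion
    # of what anyone would endorse on reflection

Nothing in that loop "decides" what to promote; it just amplifies measured clicks, which is exactly the ambiguity the comment above is pointing at.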


> It seems like what they've done is basically automate, "Give the people what they want.", as it were.

Yes. Facebook profits from giving people what they want, but did not create the demand. What's scary is the quantity of demand for nuttiness.

The other big feature of social media is that it brings diffuse minorities together. Before social media, outliers were unlikely to meet those with similar inclinations. Social media powered cohesion of groups from gays to Proud Boys to flat-earthers.

Fox News probably influences more people, anyway. Should they be "brought under control?"

I'm more worried about the trend towards censorship everywhere.


> The other big feature of social media is that it brings diffuse minorities together. Before social media, outliers were unlikely to meet those with similar inclinations. Social media powered cohesion of groups from gays to Proud Boys to flat-earthers.

And this is mostly a great thing. Like the internet started this, but Facebook (and Twitter I suppose) made it mainstream.

We're essentially in another time period like the one after the invention of the printing press, where the mechanisms for controlling the spread of information break down, and something new is born.

Unfortunately, it's much easier for journalists/policymakers et al to blame "Big Tech" rather than realise that to some extent, we are all to blame.


> Aren't their algorithms all designed to optimize for engagement? Isn't engagement just the measure of how much time you spend on and interacting with their system? Doesn't that mean that they're not deciding what to keep feeding us, but rather that we are?

This would be true if everyone were the same. However, the stuff that engages with the median isn't the stuff that engages with everyone.

Their prioritization of popular things means they censor anything not popular by pushing it down the feed, in some cases excluding it entirely.

Not everyone "wouldn't believe what happens next!"

It's pushing people's attention toward median melodrama, and hiding things that involve nuance or anything not engaging, like, for example, announcements of deaths of friends. This would possibly be excusable under your model, but they also censor what links you can share privately with friends in DMs on Instagram, WhatsApp, and FB Messenger.


> Doesn't that mean that they're not deciding what to keep feeding us, but rather that we are?

Not exactly. Think about fast food. You are genetically programmed to enjoy fatty, sugary, salty foods because these were hard to come by back in the day. Now they are easy and companies push them on us, by exploiting this evolutionary development in a harmful way.

Similarly all your social instincts like FOMO are baked in, and Facebook targets them ruthlessly.


> Similarly all your social instincts like FOMO are baked in, and [TV/radio/books] targets them ruthlessly.

Are my rephrasings of your statement any less true, or is FB somehow different?


> Are my rephrasings of your statement any less true, or is FB somehow different?

Yes. A TV show has to appeal to a million or 10 million people and the feedback cycle is a season long. It has to stay pretty close to the “average”. There's a limit on how strongly it can pull any lever, and pulling that lever might decrease influence with half the audience as much as it increases it with the other.

Whereas Facebook can tweak the experience for a single individual with a hundred feedback loops in a day and when it finds the right level pull it as hard as it can. That’s new.
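
To illustrate the contrast, a toy sketch (my own made-up example, not any real system): a tiny epsilon-greedy bandit that gets hundreds of per-user feedback signals in a day and homes in on whatever setting hooks that one user, something a season-long TV feedback cycle cannot do.

    import random

    LEVERS = [0.0, 0.25, 0.5, 0.75, 1.0]      # hypothetical "intensity" settings

    class PerUserBandit:
        # epsilon-greedy bandit tracking a single user's reactions
        def __init__(self):
            self.pulls = {l: 1 for l in LEVERS}   # start at 1 to avoid dividing by zero
            self.wins = {l: 0 for l in LEVERS}

        def choose(self, explore=0.1):
            if random.random() < explore:
                return random.choice(LEVERS)
            return max(LEVERS, key=lambda l: self.wins[l] / self.pulls[l])

        def update(self, lever, engaged):
            self.pulls[lever] += 1
            self.wins[lever] += int(engaged)

    def user_response(lever, sweet_spot):
        # pretend each user has one setting that reliably hooks them
        return random.random() < 1.0 - abs(lever - sweet_spot)

    bandit = PerUserBandit()
    sweet_spot = 0.75                    # this particular user's weak point
    for _ in range(200):                 # a couple of hundred impressions in one day
        lever = bandit.choose()
        bandit.update(lever, user_response(lever, sweet_spot))

    print(bandit.choose(explore=0.0))    # typically converges on 0.75 for this user

A broadcaster pulling one lever for ten million viewers gets one noisy aggregate signal per season; this loop gets two hundred clean signals per user per day.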


But it doesn't. Have you ever used Facebook? Essentially it shows you whatever the highest-engagement content is, which is normally a birth, an engagement, or some other life event.

Speaking as someone who develops ML models, there's no way that they are iterating multiple times per day, and it is 100% not tuned to specific users as they have so many.

And how are they supposed to measure the impact of the lever?

These are not trivial questions, but they are often treated as such in online discourse, which is pretty disappointing.


This reminds me of the Eloi and the Morlocks of "The Time Machine". The hapless Eloi are the general public, whose data are treated as tasty flesh to snack on by data scrapers like FB and Google.

Is that really the problem? When in history did people get their news and information from a large and diverse set of sources?

For as long as there's been national news, it's been dominated by a few headline sources. Before that there were lots of local sources, but nobody was reading the local news from both Connecticut and Colorado.

I don't know what the difference is today. Maybe there isn't a difference, we just prioritise modern problems over the same problems in ages past. But I really don't think the answer is a narrowing of sources, unless your point of comparison is something like fifteen years ago, for the small subset of the population who were mostly getting their news online at that point.


"For as long as there's been national news, it's been dominated by a few headline sources. ... I don't know what the difference is today."

The difference is that a diverse, global set of individuals are becoming content creators, and a potentially global megaphone is being passed on to them.

Early on in the Internet's history it seemed that this emancipation of the individual voice from corporate and government control was going to be permanent, but governments and corporations have proven to be very adept at adapting and clawing their way back to positions of control.

And, in a twist that few expected, most individuals were happy to cede control back to governments and corporations for the sake of convenience and because they wanted to be protected from the speech of other individuals.


> When in history did people get their news and information from a large and diverse set of sources?

Lots of people get information from friends and family and their community directly.

Facebook is now mediating these private communications in DM, and censoring them. There are whole sites you simply cannot link to in DMs to your friends and family.


Can you give an example of such a site?

Twitter. Facebook censors certain links to tweets, even. That's not an example of a domain that is wholly banned, but it's a good example of the practice.

Huh? I very much doubt that fb block all twitter links, so can you be more specific?

Facebook in the last year or two was blocking the sending (in DM) of links to at least one (public) tweet by a family member of the US president.

Ah OK, so it's a specific tweet.

That's less concerning to me, as they've been doing that for a long time, for various reasons.

I mean, it's not like they blocked all links of a competing social network or anything /s.


It's easy to vilify, but we need to recognize that the top-down management culture is doing something other than what we say it is. It is not servant leadership, but stealing worker responsibility and putting it in the hands of others, externalizing costs, etc.

The resulting culture follows from that, but is reported upwards as green.


For soldiers and medical personnel there are ethical guidelines and laws to follow. Why not make some for developers?

How do you create ethics for something like programming? Coding is not what needs ethics, it's management.

Professional engineers, too.

We mustn't forget the Cambridge Analytica shenanigans either.


