Hacker News
Zuckerberg: I just shared the following note with our employees (facebook.com)
83 points by tech-historian on June 5, 2020 | 160 comments



> I believe our platforms can play a positive role in helping to heal the divisions in our society,

Facebook systematically promotes and propagates the most inflammatory content posted on its platform. Why? Because it is the most engaging. This is part of the intentional design of Facebook, and is the fundamental reason there are so many problematic things happening on the platform.

Mark's "thoughts" on the topic do nothing to address this in any way.

In other words, the plan is "business as usual."


Precisely. The business of FB is measured in MAU and DAU; those are their key metrics, and the only thing that drives them is engagement. It’s basically the core of their business. And controversial items generate the most engagement. Unless an item crosses the line in black-and-white terms (a beheading, for example), and most items don’t because the world is shades of grey, it’s good for business, because it has the potential to generate more engagement. What’s good for business may not be good for society. Sometimes it is. Sometimes it’s not. Some people are ok with this and some are not.
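For context, the DAU/MAU "stickiness" ratio these metrics imply is straightforward to compute. A minimal sketch with entirely hypothetical data (this is not how FB actually computes it internally):

```python
from datetime import date

# Toy illustration of the DAU/MAU "stickiness" ratio: average daily actives
# over a period, divided by the unique users active in that period.
# All names and data here are hypothetical.
def stickiness(daily_active: dict, period_days: list) -> float:
    """daily_active maps a date to the set of user ids active that day."""
    mau_users = set()
    for d in period_days:
        mau_users |= daily_active.get(d, set())
    if not mau_users:
        return 0.0
    avg_dau = sum(len(daily_active.get(d, set())) for d in period_days) / len(period_days)
    return avg_dau / len(mau_users)
```

A ratio near 1.0 means monthly users come back almost every day, which is why engagement is the lever that moves both numbers.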


Makes sense that business incentives inform decision making, but how do you explain the difference in Twitter's content policy? They also, presumably, measure their business via MAU & DAU.

Sometimes I think it's as simple as humans making decisions. And Zuckerberg is making a spineless decision (by way of inertia).


That, my friend, is exactly the difference made by who leads the company. This is exactly why what kind of person a leader is matters so much. Be it for a company or a nation.

We get what we tolerate.


The vast majority of FB users use it for staying in touch with friends and family.

The cynical framing we encounter on HN isn't actually representative of the experience of most FB users.


That gets to the heart of the problem, actually. People use Facebook to keep in touch with family and friends, yet Facebook still systematically pushes inflammatory content on them, despite that not being the "reason" people are there. And, eventually, it gets a surprisingly high percentage of people to respond to that inflammatory content, thus furthering the cycle.


FWIW (and it might not be worth much).

I'm from Venezuela and most of my contacts are too.

Most content that I would consider "bad" that comes up for people are just memes, and stupid top X/spammy-ish/clickbaity content or recipe videos.

Maybe a different algorithm or just network effects?


It could be any number of things - for one, there may not be much advertising spend in your region compared to the US.


Are you in a target group for voter ads using fake news messages?


Yeah they should just stop hosting public profiles and figures.


The average number of friends that a Facebook user has is 338. Out of those, I'm sure there are quite a few in every person's group of "friends" who post some pretty inflammatory content. I, for example, have over 800 friends, despite the fact that I haven't posted anything in about 10 years, I don't go on the site more than once every three months or so, and the only thing I use my Facebook account for is messaging my significant other, because they mainly use Messenger to stay in contact with their friends. Yet, when I go on for my quarterly check-in because my SO wants me to like a photo, it's nothing but inflammatory posts. I quickly unfriend the posters and move on, but it still seems to mostly be those types of posts.


I generally use the unfollow button for people who post a lot of stuff I don't like. I got to the point where it's really stable; I mostly see what I want: funny memes, pictures of my friends doing things, etc. I used to have 800 friends, went on several purges to get it to 250 - much more manageable.


The idea that being barraged by insane politics is separate from staying in touch with friends and family is contrary to most people’s experience of national and religious holidays.


Rather than shying away from it, dreading Thanksgiving, or hiding behind an eggshell-thin wall of groupthink, people should learn to grow a spine. Learn to defend your own opinions, stop getting overly agitated that someone has their own conflicting ones, and generally stop caring when people disagree with you.

I’m no doctor but I’m fairly certain you’ll live longer.


You seem to live in a different world than mine. In mine, social media would be much less of a problem if people tended to shy away from online confrontation. The online/social media problem is in large part one of people defending their own opinions too aggressively.


> You seem to live in a different world than mine.

I liked the more inflammatory initial hot take of “What kind of world do you live in?” better. It goes better with my speak your mind and damn the consequences philosophy.

I wasn’t talking about online; I meant in-person communication. It’s likely hard for people these days, but once you get used to the idea that it’s okay to offend people, you realize you’re never really offended either.

> In mine, social media would be much less of a problem if people tended to shy away from online confrontation. The online/social media problem is in large part one of people defending their own opinions too aggressively.

Having any sort of personal accountability for social media defeats the purpose. It should be an outlet for the id. Be the craziest you and assume everyone else is as well.


Really? Because my parents are probably among the most tech-illiterate people I know (love you, Mom) but I've had to talk with them multiple times because they asked me if such-and-such or so-and-so was true or real and it was the result of completely fake FB groups or accounts posting bullshit that they read on there.

I'm lucky that I have such a good relationship with my family that they're willing to come to me as a source of truth for some of those things.


Have you met my friends and family? Oy vey.


> The vast majority of FB users use it for staying in touch with friends and family.

In my personal experience it's the opposite. I even told friends about a Chrome plugin to get rid of everything but friends' activities, and they said they primarily use it for just scrolling through news and memes and shit.


Most people who complain about FB on HN, "deleted FB years ago"


Yet our profiles are still alive and well, as our friends are telling us.


I disagree completely.


Just because you haven't been radicalized on FB doesn't mean it's not a real problem.


I find your comment sensible.

There is a podcast called Rabbit Hole (by the NYT) which was very revealing to me.

I started using YouTube before it became a popular platform, and never in my life have I watched a conspiracy theory video on YouTube! I don't think I have ever been recommended one either. I only hear about them in the news.

So hearing that person talk about how YouTube recommended videos upon videos to them was really surprising. There were names that I had never even heard of.

So yes, social platforms changing people's views is real and a problem.


Yeah, same goes for books. You know how many people have been indoctrinated and persuaded by books? Let's burn those while we're at it.


People tend to bring this up as some sort of forceful counter-example, but the printing press contributed significantly to religious sectarianism in Europe, fuelling the religious wars of the 16th and 17th century which killed over a third of the German population.

Books might not have been burned, but about 50,000 'witches' were.

A rapid increase in the dissemination of information into systems that have no mechanism to absorb it is no trivial matter.


Witch burning pre-dates the printing press. Might want to look up the 13th century some time. Didn't know there were any anti-printing press advocates left out there. I thought y'all died off during the Enlightenment? As far as sectarianism goes, the Catholic hegemony was never going to last forever. And Europe was plagued with sectarian violence for thousands of years. Free exchange of information is the one thing that eventually got them to come together.


Books can be publicly scrutinized; a Facebook feed will only reach its targeted audience. I think a more apt comparison is pamphlets at a rally. And even a rally is absolutely scrutinized—or even stopped by the police—if it is found to distribute illegal, radicalizing, or dog-whistling material.


In real life there’s a loophole to get around this kind of scrutiny: religion. It doesn’t matter which one really. But if it’s religious and you don’t discriminate against people in obvious ways, you have much wider latitude to say eyebrow-raising things in public.

Seems like that works online too, now that I mention it.


If the cost (monetary and in one’s credibility) of publishing a social post matched that of publishing a book, we wouldn’t be here discussing this.

If we allow anyone with opposable thumbs to spew garbage online, platforms have some responsibility to set limits or to require making the source/author public.


Yeah, that's how freedom of speech works. Listen, you don't police private conversations among friends. Not in America. One man's trash is another man's treasure. I'm sure I could comb over your personal beliefs and find all sorts of garbage, including what you're saying now. Corporate regulation of speech is a losing proposition for society. Period.


My only ask is that you see the difference between free speech as intended in the Constitution vs. free speech under an unverifiable and easily forged online pseudonym. You may not even be defending the free speech of an American citizen/resident.


Free speech under an unverifiable and easily forged online pseudonym is worth the cost of the speech. If reading my $0 speech incites you to do something stupid or illegal, bad on me, but good luck enforcing that; in the meantime you did something stupid or illegal and will have to live with the consequences.

If people continue to act on information without regard to its source or veracity, and we need to rely on the platforms to sort it out, we're in some deep shit. The telephone company never comes on the line and says 'Hey harikb, your friend is full of it, what your friend said is totally untrue' but somehow we expect that of today's communications platforms. If the telephone company was monitoring the content of your calls, you'd be rightfully pissed.

Of course, maybe these platforms could stop showing me random garbage from random people that aren't on my list, and weren't meaningfully interacted with by my list. (What gets shown on FB because someone I know liked a corporate page 5 years ago is like huh???)

Disclosure: I used to work at WhatsApp, including while it was part of Facebook; my opinions are my own.


I believe free speech is a human right. Facebook itself is not built around pseudonyms; that would be Twitter, more so. But the founders of the U.S. regularly wrote under pseudonyms in the Federalist Papers. Ben Franklin also regularly wrote under pseudonyms.

I don't like social media, as it exists today. I think it's a toxic form of communication.

But I'd rather people just realize that and migrate off it and towards chat rooms, group chats, discussion forums, etc.

I'd be fine with forcing social media companies to regularly warn users of addiction and to provide built in and highly visible tools for tuning it out.


It is difficult to get a man to understand something when his salary depends upon his not understanding it.

- Upton Sinclair


We all act like we're above the false corporacy with our casual ways and toxic peccadilloes, but it's actually a signal that we're willing to be even more ignorant than the average bloke.


Facebook is hands down the most toxic platform I've encountered. It doesn't get as much flak as Twitter because not every 'post' is visible to everyone, but go to your local town's Facebook group. Boy oh boy, the stuff you find there is horrid.


Is that a problem with the platform or is that a problem with people?


Marshall McLuhan addressed this question in great depth.


The problem is on all monetized platforms, esp those governed by ML. It will always degenerate into the worst of everything because of humanity ⓧ money equals sadness.


You are entirely right.

I've long said that optimising for "engagement", with no regard to what kind of "engagement" that is, is a paperclip maximiser.[0] Rewarding the posts that get the most comments regardless of whether they are positive or negative or thoughtful or shitposting, and the posts that get the most reactions regardless of whether the reactors (?) have or have not given it a few seconds of thought, is destructive to understanding each other and destructive to democracy.
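The failure mode described here can be made concrete with a toy ranking function. This is purely illustrative (all names and numbers are made up, and this is not Facebook's actual algorithm): a ranker that counts engagement with no regard for its valence will always surface the outrage bait.

```python
# Hypothetical illustration: score posts purely by raw engagement counts,
# with no regard for whether that engagement is thoughtful or hostile.
def engagement_score(post: dict) -> int:
    return post["comments"] + post["reactions"] + post["shares"]

posts = [
    {"id": "calm_update", "comments": 3, "reactions": 40, "shares": 1},
    {"id": "outrage_bait", "comments": 250, "reactions": 90, "shares": 70},
]

# The inflammatory post ranks first regardless of the quality of engagement.
ranked = sorted(posts, key=engagement_score, reverse=True)
```

This is the paperclip-maximiser shape of the problem: the objective is optimised exactly as specified, and the specification is the problem.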

See, I would rather Facebook not de-platform Trump, or any of his fans, or anyone else I don't like, but let's not pretend that Zuckerberg gives a shit about peace, love, and understanding. They have taken conscious decisions to prioritise "engagement" over anything else, because that is what makes them money. I get it, that's business, that is what they have to do. Likewise, we should do what we have to do to protect ourselves and to protect democracy: Zuckerberg and anyone that works at or has worked at Facebook should, to borrow a phrase, be unto us as heathens and publicans.

(You can extend this to Twitter too. I know, they scored some imaginary Internet points with left-leaning crowds by "fact-checking" Trump, but the truth is, with the algorithmic timeline providing exactly the same rewards to shitposting as Facebook does, they are as much to blame for this too.)

[0] https://wiki.lesswrong.com/wiki/Paperclip_maximizer


I am not so sure - I had to put away my Twitter usage because of all the crap that is going on. I have pretty much dumped Facebook because it has nothing but crap on it.


> Facebook systematically promotes and propagates the most inflammatory content posted on its platform.

This is not borne out by the facts. Facebook is a lot tamer than unmoderated, uncurated, algorithm-free platforms.

This idea that facebook promotes inflammatory content was invented by unscrupulous people, with the goal of getting facebook to censor even more than they already do, because this will give their viewpoint the upper hand.


>Facebook systematically promotes and propagates the most inflammatory content posted on its platform. Why? Because it is the most engaging. This is part of the intentional design of Facebook, and is the fundamental reason there are so many problematic things happening on the platform.

Remove the word "Facebook" and replace it with "MSM". How is that okay?


It's not okay, and it is a real problem with ad-supported platforms. Twitter and YouTube have both been dealing with this same issue.


The problem is the recommendation algorithm. Oh you like this far right video? Here’s a bunch more!!!


He is clearly viewing this as another technical problem to solve. He's a highly technical person; this follows.

But the issues he's being confronted with aren't technical issues of permissions and algorithm tuning; they're societal problems of power dynamics and information manipulation. These are human traits. They aren't problems that are "solved." They're issues that you have to take a stance on. Do you believe in letting people lie on your platform? What about advertisers? What if the advertiser is a political party?

Further, this feels like it's worth precisely nothing, given what he says right up top:

>I want to be clear that while we are looking at all of these areas, we may not come up with changes we want to make in all of them.

Zuck has shown himself unwilling to accept the responsibility of controlling his platform. He is failing every user and employee of Facebook.


> He is clearly viewing this as another technical problem to solve

I think you misspelled PR.

Facebook has repeatedly followed variants of this script. Do what Zuckerberg and Sandberg want, and if it blows up, find a way to finesse it or at least block and tackle until humanity is distracted by the next outrage.

They lie, distract and keep doing whatever they want.


You mean how after the Cambridge Analytica scandal when he testified in front of Congress? And how literally nothing came from that?


> He is failing every user and employee of Facebook.

I won't agree on "every". In fact, many FB employees prefer to avoid getting involved in the mud fight.


I'm not a FB employee, nor a Trump supporter. But I'd argue that fighting censorship is also a moral obligation for free speech.


> He is failing every user and employee of Facebook.

Why are users seen as powerless vassals? Why aren't they assigned any responsibility for their actions?

It's entirely possible to use Facebook as a "read-only" Rolodex for finding people for personal communication. There's no need to "engage" with whatever public posts are "promoted".


>> He's a highly technical person; this follows.

Is Zuckerberg even that technical? I’m aware he worked on the site originally, but never got the impression that he was highly talented (compared to, say, Google’s founders).


Did the Winklevoss twins ever talk about that in interviews? I think they wrote most of the source code for ConnectU, which Zuckerberg was at the very least aware of. I don’t recall if he or Facebook was ever found guilty of misusing ConnectU code in Facebook. In fact, the settlement that was reached between the relevant parties has had many court cases of its own since the settlement was reached, regarding lost value of agreed-upon stock value which had lessened relative to the value suggested by Facebook at settlement. The court cases may even be continuing to this day, but I couldn’t really tell from reading a few wiki pages and sources.

I don’t have an answer, sorry. But hopefully someone else can correct me and fill in the gaps.

https://en.wikipedia.org/wiki/ConnectU


Nothing is really being announced. He said they will discuss a few things but what will come out of these discussions is not guaranteed. Great way to say you're doing something without doing anything.


The most-used verb in the numbered list is 'review'. I hope some action comes of this, but, I dunno, it's Facebook.


I think you have your answer.


I think it's just about time for another Zuckerberg apology tour.


I keep waiting for him to say "We have our Top People on it... Top People." I see him saying it with a monocle and a white long-haired cat in his lap.


I noticed that, too. Also: “going to”


Remember when Facebook made a few blog posts about how they were going to investigate ways to reduce fake news?


Kind of like naming a street “Martin Luther King Jr Blvd”.


This Reddit comment was addressed to the CEO of Reddit, but it feels just as pertinent for Facebook and Mark Zuckerberg.

https://www.reddit.com/r/modnews/comments/gw5dj5/remember_th...

There's nothing in this announcement, it's toothless. "We'll think about it! We'll try to come up with something!"

The man is useless.

------

This other, short comment from the same thread encapsulates it well, too: https://www.reddit.com/r/modnews/comments/gw5dj5/-/fstgtta


Unless something is illegal, Facebook or any website should allow the content. There will always be people who don't like a person or a viewpoint. It's a slippery slope to start moderating content that is not illegal.


> It's a slippery slope to start moderating content that is not illegal.

It's nothing of the sort. Moderation is critical, or it would degenerate into garbage. Dan does an excellent job on this site, for example.


Why would anyone allow spam in a website? It's nonsensical to assume I meant that.



Is this an argument against the parent post? If so, it's not clear what the argument is.


"Radio Télévision Libre des Mille Collines (RTLM) was a Rwandan radio station which broadcast from July 8, 1993 to July 31, 1994. It played a significant role in inciting the April–July 1994 Genocide against the Tutsi." ... "it is widely regarded by many Rwandan citizens (a view also shared and expressed by the UN war crimes tribunal) as having played a crucial role in creating the atmosphere of charged racial hostility that allowed the genocide to occur."

Real, verifiable proof that any publisher of information or platform that hosts information has a responsibility to ensure that they are not making society worse. This radio station fanned the flames of hatred and is responsible in part for the murder of hundreds of thousands of people.


It can equally well be used as an example of why hate speech should be outlawed rather than moderated.


Regrettably it isn’t that clear cut. While hate speech is illegal—and rightfully so—racists, bigots and warmongers can and do find ways to spread their hate by legal means. This includes shorthands, dog whistles, framing and tone, etc.[1] I don’t think we will ever be able to conceive a legal framework that can catch all of these as hate speech. So we need moderation alongside criminalization of hate speech.

1: I recommend the YouTube series The Alt-Right Playbook for an overview of these tactics: https://www.youtube.com/watch?v=4xGawJIseNY&list=PLJA_jUddXv...


> I don’t think we will ever be able to conceive a legal framework that can catch all these as hate speech.

Then why would Facebook be able to conceive a moderation framework that can catch all these? The underlying problem to be solved is the same, it has just been removed from democratic scrutiny.


The radio station mentioned was the inciting element in a genocide. Surely it does not take a large amount of analysis to understand that the underlying mechanism, disseminating lies or propaganda on a widely used, easily accessed medium, applies directly to OP...


The link is just an example of how some speech is bad.

The point of the original comment was that badness should be determined by laws, and not by the moderation policies of private companies. I don’t see how the linked article forms an argument for or against that position.


The link provides a clear example of where this policy fails: false statements about the Tutsi and other ethnic groups led to a genocide. None of this was illegal within the definition of free speech in Rwanda - as far as I am aware - thus providing an example where moderating content only on the basis of legality fails.

As a cherry on top, the link also mentions that the radio station was actually endorsed by the government and provided equipment to them. Should the law be the best basis for moderation when the morality of those writing the laws cannot be spoken for?


Facebook moderates a lot of content already. Read their community guidelines: almost anything that’s not advertiser-friendly is not allowed.

I agree FB should not moderate content too heavy-handedly, such that it wrongfully suppresses free speech, and consumers should choose a platform that is healthy for society, but that might be too much to ask of the average consumer.

You also need to realize FB’s recommendation engine corners people into ideological echo chambers.


This is not really the largest problem with FB at all. It's that they finely tune us in Skinner boxes to react to shitposts. If the algorithm crosses the line and you back off, it tries something else.

It just so happens that what we humans engage most with is inflammatory bullshit, so of course the algos push that. It's not personal; there's just more money in it.


Most platforms do some kind of moderation, and when a platform is as pervasive as facebook, it becomes essential. We’ve already seen how a lack of moderation on facebook influenced the 2016 election. So it is not hard to argue that facebook’s negligence is already causing significant harm to our democracy.


This is how you moderate your feed on Facebook: if you don't like someone and what they have to say, unfriend them. Problem solved.


My personal feed is not the issue here. The issue is how reactionary material keeps popping up in the feeds of people who are susceptible to rallying behind it. The target area is so vast that you are bound to influence a lot of people. And the nature of the material only belonging to personal feeds makes scrutiny by third parties impossible. All of this culminates in a really dangerous propaganda machine, especially since the platform is used by people who only go there to keep in touch with friends and family.

Not moderating Facebook is criminally negligent and is putting our democracy in woeful danger.


> My personal feed is not the issue here. The issue is how reactionary material keeps popping up in the feeds of people who are susceptible to rallying behind it.

Wait, how do we know you are not susceptible to rallying behind the content in your feed?

Has the content in your feed been approved? Only cat videos and family pictures?


Sorry, I should have been more accurate: “No single user’s personal feed is at issue here.”

Sure, if everyone magically knew whom to block and what to ignore, then we wouldn’t have an issue. But that’s not the case here, is it? If a large number of people get a constant stream of reactionary posts without any scrutiny or fact checks, we wouldn’t know any better than to get behind them. That is kind of how propaganda works. And that is exactly what is going on in the personal feeds of millions of Facebook users.

All we’re asking is that Facebook lower the prominence of false or hateful content, add fact checks to false content, and counter reactionary content with scrutiny. Facebook is being grossly negligent by allowing the current state to continue, and—given how pervasive the platform is—Facebook is currently harmful to our democracy.


Assume that Facebook is based in Alabama and 90% of its employees are conservative Christians. The employees could think it's criminally negligent to allow FB posts on abortion and LGBT issues. You think only you're right; they think only they're right.


I see your point, but I don’t think it’s entirely accurate. First of all, LGBTQ+ rights are human rights; speaking against those is kind of sick, and anyone who does so is wrong and should be scrutinized. In a world where Facebook doesn’t see that, it is our duty to protest them and hopefully get them shut down for spreading hate speech.

But that is only a shallow—albeit correct—take. A deeper take is that you must consider the quality of the content and the pervasiveness of the lie it’s spreading. Anti-abortion content that just states “I believe every fetus has the right to life” is not as harmful as content that calls every person who has had an abortion a “murderer” or spreads falsehoods about what the act of abortion entails. In this flipped reality, Facebook would have the full right to add fact checks to content saying something like “There are no negative consequences to abortions”, or “If you think you are straight, you can fix that by calling...”.


> My personal feed is not the issue here. The issue is how reactionary material keeps popping up in the feeds of people who are susceptible to rallying behind it.

What you are basically arguing is that people are too stupid to handle free speech.


Please don’t put words in my mouth...

If reactionary speech gets scrutinized or if false speech gets a fact check it doesn’t make it any less free does it?


This reading is not the most charitable, but what I read is basically this:

You personally do not have a problem, but other, less smart people are susceptible to this effect, and because each person sees only their own feed, none of their betters can check up on them.

That sounds to me more like arrogance and exactly the kind of issue people rebel against when they mention they don't like experts.


Sorry, that was my fault. What I meant—and what I should have said—is: “No single individual’s feed is the problem”, but I felt it was stronger to take a personal stance.

Giving the advice that you can curate your own feed, as if that were somehow a solution to the fact that millions of people are bombarded with reactionary misinformation, is disingenuous at best.


I have used the “mute this person for 30 days” feature quite a lot lately. I wish Twitter had it.


I'm still not 100% clear on how the 2016 election was influenced on Facebook. Both American parties say the Russians interfered in order to help the other side. Liberal FB employees are upset about Trump, while conservatives think they are being censored on the platform.

Can you point me to any clear, neutral, unequivocal research that illustrates the effect, including impact? Did it swing the election to Trump, or try to steal it from him? How does this compare to traditional political media advertisements?



Read this post by the Facebook VP - https://www.facebook.com/boz/posts/10111288357877121

As much as I hate Trump, many people are salty that he won and want to put a blame on something. That's it.


Where is his detailed analysis of whether targeted political fake-news articles based on personal data could flip swing states?


You're asking a variation of 'When did you stop beating your wife'



Isn't "slippery slope" considered a logical fallacy? Until fb deletes something and by doing so clearly causes harm to society, let's not assume that any-moderation-whatsoever is evil.


The Cambridge Analytica scandal already proved harmful to democracy. False (and illegally financed) advertising on Facebook is also believed to have influenced the Brexit referendum enough to swing the vote. The current (lack of / bad) moderation is already causing harm to society.


There is no scandal if you read Boz's (FB's VP) post on Cambridge Analytica. Of course, very few people are going to believe it since it came from Facebook itself. It's mainstream media vs Facebook on who gets the eyeballs.


I would love to see a set of (enforced) policies that are blind to the user in question, if such a thing could be. For example:

1. If a post is flagged for review, the account name/owner, social score, social graph, ad revenue, all of it, is hidden from the review person, team, or algorithm. The only history they would have access to is violation history, influencing some kind of recidivism penalty.

2. The duration and severity of the consequences of the violation is applied to the account automatically, and the "judge" is unaware of any backlash from the user / community.

3. Any protest or appeal is sent through the process, blind again.

I realize this isn't nuanced or realistic in the slightest. What about mistakes? What about gamification of moderation processes? What about revenue impact? How could #1 be cleaned enough to fool anyone...

Still.
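The numbered proposal above could be sketched roughly like this. All field names here are hypothetical; it is an illustration of the blind-review idea, not any real moderation system:

```python
# Rough sketch of "blind review": strip identity, reach, and revenue signals
# before a flagged case reaches the reviewer, keeping only the content, the
# flag reason, and the account's violation history (for a recidivism penalty).
def blind_review_case(post: dict, account: dict) -> dict:
    return {
        "content": post["content"],
        "flag_reason": post["flag_reason"],
        "prior_violations": account.get("violation_count", 0),
    }

case = blind_review_case(
    {"content": "flagged text...", "flag_reason": "harassment",
     "author": "big_advertiser", "ad_revenue": 10_000},
    {"owner": "big_advertiser", "followers": 2_000_000, "violation_count": 2},
)
assert "author" not in case and "ad_revenue" not in case
```

The point of the design is that the reviewer's input simply never contains the fields that could bias the judgment, rather than trusting the reviewer to ignore them.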


> What about revenue impact?

Write that in cursive on the black board 100 times before you are allowed back in your cubicle, employee #140932.


Can I ask an honest question:

For those of you who aren't satisfied with his response, what do you want Facebook to do?

I haven't really been following this particular news story, but I've been seeing a lot of anger against Facebook here and even on my LinkedIn feed.


Decentralize the platform and secure personal data so data can't be brokered.


As someone who doesn't use facebook at all, this reads as so hilarious. Zuck knows, in this "difficult moment", his internal employee memo would leak, so he makes it public outside the walled garden.

Hey Mark, let us know when you have something concrete to say to us peons.


Oh good, they’re reviewing their policies. Worth just about as much as an internal investigation at a police department.


The business of FB is measured in MAU and DAU; these are their key metrics, and the only thing that drives them is engagement. It's basically the core of their business, and controversial items generate the most engagement. Unless an item crosses the line in black-and-white terms (a beheading, for example), and most items don't because the world is shades of grey, it is good for business, since it has the potential to generate more engagement. What's good for business may not be good for society. Sometimes it is; sometimes it's not. Some people are OK with this and some are not.


As much of a shit show as FB tends to be, it's still a personal choice to create an account and/or use the one you have. Put simply: don't use it. (I know, easier said than done, especially for the people who rely on FB to stay in contact with family and friends.)

We can live without FB as difficult as it may seem for some. We did just fine prior to the social media paradigm and can certainly do without the hate and vitriol promulgated through social media platforms, especially FB. Let the down votes begin!


Almost every paragraph in that post starts with "we will" or "we have". I'm sure Mark actually meant to say "I will", given that he more or less has unilateral control over the entire community of 2.5 billion people. The world's largest autocracy, if you will.

Wild proposal: if the community, and in this case the voices of African-American members, matter so much, why not give up some of that decision-making power?


This contains much too little discussion of the feed algorithms.


Delete your Facebook account. Anyone that matters will be able to figure out how to get a hold of you. It might minutely inconvenience you for a week, but you'll be much happier thereafter.


Except a bunch of people who matter to me are not as comfortable with technology as HN's readership and for them the easiest way to get hold of me (or anyone else) is via Facebook.

> Anyone that matters will be able to figure out how to get a hold of you.

I used to think like this in university and even a few years after but having grown my social circle beyond tech, I now find this line of thinking snobbish. On the contrary, since these people matter to me, I don't want them to have to figure out how to get hold of me. There should be a default option to get to me and for most of them that is, unfortunately, Facebook.

Yes, email didn't suddenly stop working, IRC is still a thing, there are a lot of other apps out there that respect your privacy, etc., but these people buy the Android phone that fits their budget, see that FB is installed by default, and keep in touch with that. And the UX of Messenger is, hands down, better and more personal than any email client their phone comes with.


It's not snobbish. Your way of thinking is insultingly infantilising.

I'm sorry for the disparity between your post length and mine, but there isn't anything else to say. You view the people around you as fundamentally incapable. "How ever did those poor, stupid assholes figure out how to talk to each other before us technical people came along?"


I don't view the people around me as "fundamentally incapable". I am saying they are _unwilling_ to adopt a mode of communication other than Facebook because of Facebook's _ubiquity_. And I think it's a big mistake not to acknowledge how much this ubiquity factors into Facebook's grip in the social media market.

You are saying: "Delete Facebook. It's easy, you will be happier". And all I am pointing out is why this is such a gross oversimplification of the issue to the point that it does not really contribute a viable solution.

> "How ever did those poor, stupid assholes figure out how to talk to each other before us technical people came along?"

I don't see how my post comes across as something as condescending as this. You misread me greatly.


It's incredible that he and his army of PR folks actually thought this post would convince people they are good guys doing a lot of hard work to fix those issues by now.


> If a newspaper publishes articles claiming that going to polls will be dangerous given Covid, how should we determine whether that is health information or voter suppression?

That's a really interesting and tricky question. Now remove the word "how" and think about it again...


He said that instead of flagging, they’ll take it down entirely. Guess that’s a step forward.


It is very apparent that Facebook wants to avoid appearing to be an "editor" of content in almost any shape or form, in an effort to reduce its chances of being regulated by the government.


I'm surprised that the general sentiment on HN right now seems to be that Facebook should act as gatekeepers, something that has traditionally been one of the major fears.


This reminded me a little bit of a writing-pedagogy course I took, where we had recurring discussions about the difference between being a gate-way and a gate-keeper.

These discussions circled around the power and responsibility we'd have as instructors to simultaneously help our students obtain the important credential on our side of the gate, and on the other hand to protect the value of that credential.

There wasn't a one-size-fits-all answer to this question; the point was to make us wrestle with the values involved and come to defensible conclusions. We had to think about what was inside the gate, what was outside the gate, and when it was our responsibility to stand between them.

Unfortunately, it doesn't seem like many people running these platforms (not to point fingers; I was just as blind) realized something like "As a <world leader>, I want <to incite violence> so that <someone else might do crimes>." was missing from their user stories.


Facebook has _always_ moderated content. Fairly substantially, much more than Twitter, say. For instance, if you want to post a picture of yourself with no clothes on on Twitter, Twitter is basically okay with that; Facebook certainly isn’t. The current controversy is over whether public figures should be treated as entirely special, or just mostly special.

Look, Facebook is within its rights to ban the average person for whatever it feels like, while still letting important small-handed personages lie about elections. That’s perfectly legal. But people are certainly entitled to question their ethics over it.

Of course, even the arch-censors over at Twitter allow important small-handed lying; they just reserve the right to point out that it IS a lie, and very occasionally do so. I’m kind of amazed that people are so upset about them doing the bare decent minimum.

By the way, I’m not defending twitter in this; it would be better if they just applied their rules equally and deleted the accounts of important people, whatever their hand measurements, on the same basis that they delete normal accounts. But at least attaching disclaimers to lies and abominable speech is progress of a sort, I suppose.


A lot of freedom of speech policies and preferences, including US criminal law, make exceptions for speech that is likely to incite imminent violence. Look up the "true threat" doctrine. I'm not surprised that HN sentiment has a similar exception.


I am pretty sure US free speech exceptions do not prohibit threats by government to deploy the military against people, even if that is the very embodiment of a true threat that is likely to incite imminent violence.

But the HN community is not against that. You can have a news article about Nation X declaring the intention to send an army against Y if people think Y deserves it. People in general love violence and encourage it when it is the right kind of violence against the right target.


Yeah, most speech that relates to violence is neither illegal nor necessarily opposed by HN commenters (and of course plenty of us disagree with each other on what we want to tolerate). The exceptions are a lot narrower.


Targeting your own population with the military is unconstitutional, though, with just a few exceptions.


We can define Nation X as the US and Target Y as US citizen and go down different scenarios and see when it becomes the right kind of violence.

A peaceful protest (this should be universally agreed on as a very bad target for military use).

Rioters that burn cars and loot stores.

White supremacists marching in the street.

Exact same actions as any above but we call it terrorism.

US citizens committing terrorism that is indistinguishable from regular crime.

US citizens committing terrorism where military weapons like bombs are used (we could also split this up based on what reasons the terrorists have).


X=Y

Target is not people, but actions in specific time and place.

Measures need to be justified and scrutinized.


When people talk about treating everyone equally and banning speech that incites imminent violence, they don't talk about how they want exceptions when inciting imminent violence is justified and scrutinized.

What is justified is also very subjective, an aspect which a lot of people bring up in discussions. Who gets to decide when violence is justified? Is it the people, and if so, how should we count the votes?


It is not decided by direct democracy and not by dictatorship. If people involved are unable to act responsibly, they are the wrong people for the job.


If we didn't allow speech that might incite violence we'd still have slaves and women wouldn't have rights.


Agreed. The true threat exception to the First Amendment, which can indeed lead to a legally and constitutionally valid criminal conviction in the US, is a lot narrower than "might incite violence".


People were also upset at FB for not censoring Trump's post about mail-in voting. What imminent violence was it about to cause?


I can't speak to that one. I wouldn't want that one censored.


A reasonable person might assume that people have some core moral axioms and their political positions flow out as logical derivations starting from those axioms. This isn't actually the case in reality.

People's political positions are mostly formed via induction from the pundits of their tribe, whether it be individuals like John Oliver or Sean Hannity or newspapers like the NYT or the WSJ, which in turn form their positions through a complex calculus of tribal affiliations, but never based on axiomatic moral values. The values are only retrofit to provide a veneer of logic to the process. These positions then flow down the influencer pole until the average Joe is virtue signaling about them to his 5 friends.

Not only this, there isn't any feeling of cognitive dissonance when these political stances clash, because they were never really about a logical moral system to begin with.

Thus we see people saying both that:

* "Freedom of Speech" is great but FB should censor individuals based on the speech fashion of the day.

* We should "believe all women" but only if they can lead to the potential cancelation of Brett Kavanaugh, not Joe Biden.

* You were a terrible human being for spreading COVID if you so much as left the house 2 weeks ago but now you are terrible human being if you don't go out and congregate in groups of hundreds and thousands for the protests.

The mistake is assuming there is a logical thought process as opposed to mere tribal affiliation and signaling.

Growing up, I was taught (not in those words, but by implication) that burning books was one of the most symbolically powerful and worst non-violent acts an oppressing regime could do. It's saddening to see that metaphorical books are still being burnt, but by a different set of people.


The general fear is that it will lead to an authoritarian dystopia. Which may or may not be the case. We have however seen a glimpse of what happens when you don't do it and it's an authoritarian dystopia. So gate keeping is avoiding the more certain evil.


This situation happened because FB is complicit, not because they didn't do anything.

They optimize for ad revenue, not for decency or the human angle. FB is basically the Matrix and we are the ad revenue batteries powering it.


My grain of salt meter just kicked in, here's hoping.


Zuckerberg wasting our time as usual. Hope he can shut the eff up. Obviously he is one of the main weaknesses in our democracy. I hope he gets eaten.


I'm not a fan of Facebook or Trump, but if I were writing the rules for governing a platform designed to connect people, then refusing to censor the elected president of the US would seem like a sensible rule.

But it's not a sensible world at the moment.


"We have work to do" in the last paragraph is exactly what he's said in every public apology for this or that.


That's painfully PR-ish even by PR standards. Lots of words not much substance. Review this, review that...


"We're going to review"

"We're going to review"

"We're going to review"

"We're going to work on"

"We're going to review"

There's an old joke in Washington - when you want to look like you're doing something without doing anything at all, you commission a study (you review it).


I hope there's a FB meme "We're going to work on shipping that feature" and "I want to be clear that while we are looking at all of these areas, we may not come up with changes we want to make in all of them."


Lots of ideas, few commitments. I guess this is positive.


That’s a lot of words for saying nothing.


A whole lot of review of policy and very little clear commitment to action.


Facebook is a dying platform, and due to its enormous size and momentum it may be a decade before its downward trajectory is fully appreciated. Facebook will also fight like hell to remain relevant, and the cheapest way is making empty pronouncements like the claim that they can help "heal the divisions in our society". It helps that they can never be held accountable for it.

Why do I think it's dying? Generative youth culture has long since left it for other social media platforms. In the US even the middle-aged suburban demographic is getting tired of the Karens and Boomers spewing their hatred or turning what could be valuable online community spaces into their culture-war battleground. Facebook is very good at driving engagement in people who are susceptible to conspiracy theories and extremist views on any spectrum, and it generally excels at turning people into assholes who would not be tolerated in civil society. Because it's statistically easier to get high engagement metrics from people who are susceptible to fear tactics and have low critical-thinking ability, their algorithm hyper-focuses on feeding them exploitative content. To be honest, Fox News perfected this on an older medium. Facebook claims ignorance and plausible deniability: all they are doing is maximizing the daily-active-user analytics.

It's no wonder that it was the birth platform for Zynga and the genre of games that exploit the dopamine reward system. The games industry was too obvious about it, and now cultural opinion has turned against them. They'll never go away, much like slots and casinos. Facebook is going with the same playbook, but substituting shiny coin rewards with confirmation-bias newsfeeds and viral factoids.

Sadder still is that most of the executive team has no idea what their critics are talking about, because they have largely no social acumen, empathy or wisdom. Really good at convention keynotes: "But look, we're also doing VR and AI! We can't wait to see what you do with it, our free developers!"... They can point to numbers going up in fancy real-time dashboards, and they get praise from shareholders and bosses. They don't understand what kind of people they snare to make their numbers work. People quitting their platform is a lagging indicator, at least by years, that they built nothing of lasting human value, but exploited flaws in human psychology. I could be wrong, of course; cigarette companies are still going strong.


Could someone copy / paste the text here for those of us who have blocked *.facebook.com domains in our /etc/hosts file?


I just shared the following note with our employees, and I want to share it with all of you as well.

---

As we continue to process this difficult moment, I want to acknowledge the real pain expressed by members of our community. I also want to acknowledge that the decision I made last week has left many of you angry, disappointed and hurt. So I am especially grateful that, despite your heartfelt disagreement, you remain focused on taking positive steps to move forward. That can't be easy, so I just want to say I hear you and I'm grateful.

I believe our platforms can play a positive role in helping to heal the divisions in our society, and I'm committed to making sure our work pulls in this direction. To all of you who have already worked tirelessly on ideas to improve, I thank you. You're making a difference, and together we'll make a difference. And while we will continue to stand for giving everyone a voice and erring on the side of free expression in these difficult decisions -- even when it's speech we strongly and viscerally disagree with -- I'm committed to making sure we also fight for voter engagement and racial justice too.

Many of you have asked what concrete steps we can start working on to improve our products and policies. I want to share more about the seven areas I discussed at Q&A that we're focusing on initially. Based on feedback from employees, civil rights experts and subject matter experts internally, we're exploring the following areas, which fit into three categories: ideas related to specific policies, ideas related to decision-making, and proactive initiatives to advance racial justice and voter engagement. I want to be clear that while we are looking at all of these areas, we may not come up with changes we want to make in all of them.

Ideas related to specific policies:

1. We're going to review our policies allowing discussion and threats of state use of force to see if there are any amendments we should adopt. There are two specific situations under this policy that we're going to review. The first is around instances of excessive use of police or state force. Given the sensitive history in the US, this deserves special consideration. The second case is around when a country has ongoing civil unrest or violent conflicts. We already have precedents for imposing greater restrictions during emergencies and when countries are in ongoing states of conflict, so there may be additional policies or integrity measures to consider around discussion or threats of state use of force when a country is in this state.

2. We're going to review our policies around voter suppression to make sure we're taking into account the realities of voting in the midst of a pandemic. I have confidence in the election integrity efforts we've implemented since 2016. We've played a role in protecting many elections and now have some of the most advanced systems in the world. But there's a good chance that there will be unprecedented fear and confusion around going to the polls in November, and some will likely try to capitalize on that confusion. For example, as politicians debate what the vote-by-mail policies should be in different states, what should be the line between a legitimate debate about the voting policies and attempts to confuse or suppress individuals about how, when or where to vote? If a newspaper publishes articles claiming that going to polls will be dangerous given Covid, how should we determine whether that is health information or voter suppression?

3. We're going to review potential options for handling violating or partially-violating content aside from the binary leave-it-up or take-it-down decisions. I know many of you think we should have labeled the President's posts in some way last week. Our current policy is that if content is actually inciting violence, then the right mitigation is to take that content down -- not let people continue seeing it behind a flag. There is no exception to this policy for politicians or newsworthiness. I think this policy is principled and reasonable, but I also respect a lot of the people who think there may be better alternatives, so I want to make sure we hear all those ideas. I started meeting with the team yesterday and we're continuing the discussion soon. In general, I worry that this approach has a risk of leading us to editorialize on content we don't like even if it doesn't violate our policies, so I think we need to proceed very carefully.

Ideas related to decision-making:

4. We're going to work on establishing a clearer and more transparent decision-making process. This is clearly not the last difficult decision we're going to have to make, and I agree with the feedback from many of you that we should have a more transparent process about how we weigh the different values and equities at stake, including safety and privacy. I think we can provide more transparency into what goes into the policy briefings and recommendations that get sent to me. These analyses are done thoroughly by Monika Bickert's team and take into account many voices. Since I accept the team's recommendations the vast majority of the time, this process is where I think we should focus most on transparency. For the most sensitive escalations where I discuss with the team further rather than just accepting their recommendation over email, we can try to outline how we incorporate all perspectives into those follow-up discussions as well, even though that tends to vary depending on the equities at stake in each decision.

5. More broadly, we're going to review whether we need to change anything structurally to make sure the right groups and voices are at the table -- not only when decisions affecting a certain group are being made, but when other decisions that may set precedents are being made as well. I'm committed to elevating the representation of diversity, inclusion and human rights in our processes and management team discussions, and I will follow up soon with specific thoughts on how we can structurally improve this.

Proactive initiatives to advance racial justice and voter engagement:

6. We've started a workstream for building products to advance racial justice. Many of you have shared ideas in the past few days on product improvements we can look at, and I've been impressed by how quickly we've moved here. I've asked Fidji to be responsible for this work, and Ime will be shifting some volunteers from our New Products Experimentation team to focus on this as well. They'll have more to share on the first set of projects we're planning to take on soon.

7. We're building a voter hub to double down on our previous get-out-the-vote efforts. At the end of the day, voting is the best way to hold our leaders accountable and address many of these long term questions about justice. Our efforts will draw on lessons we learned from our successful Covid Information Center in order to make our voting and civic engagement efforts as central as our efforts around Covid recovery. We'll focus on making sure everyone has access to accurate and authoritative information about voting, as well as building tools to encourage people to register to vote and help them encourage their friends and communities to vote as well. In 2016, we ran one of the largest get out the vote efforts in history. I expect us to do even better in 2020.

To members of our Black community: I stand with you. Your lives matter. Black lives matter.

We have so far to go to overcome racial injustice in America and around the world, and we all have a responsibility and opportunity to change that. I believe our platforms will play a positive role in this, but we have work to do to make sure our role is as positive as possible. These ideas are a starting point and I'm sure we'll find more to do as we continue on this journey. I encourage you all to also check out Maxine’s post about how you can give direct feedback on product, integrity and content policy ideas as well. Thanks for all your input so far, and I'm looking forward to making progress together over the coming weeks and months.


Thank you!


Why not just unblock it to read? It’s right within your sphere of action


Anyone else get a huge pop-up with Zuckerberg's smirking face asking them to sign in?

If Facebook is going to release a PR statement to the world/general public, can they at least remove that sign-in pop-up for that statement page?


[flagged]


As a non-user of Facebook (and a non-American), I'm curious to hear what kind of critical decisions favored conservative supporters? It seems like giving any voice to conservatives is considered a PR suicide right now. I'd be surprised if FB did that.

(whether this is only within the tech world or a broader point of view in America will soon be discovered come elections time)


Here is one recent article on the topic, but google around; FB has been hiring Republican operatives to try to quell the abuse, among other things.

https://www.theguardian.com/technology/2019/nov/03/facebook-...

As a US American and non-FB user, I can say it is fairly obvious in the way Republican politicians attack them, and how they respond. There's a strategy Republicans use with the press, and it wouldn't surprise me if it isn't obvious to those who haven't seen it play out before.


You haven't been paying much attention then

https://money.cnn.com/2016/05/18/media/facebook-conservative... https://www.businessinsider.com/mark-zuckerberg-holding-priv...

tl;dr: conservatives bitched that social media has a liberal bias, and social media moguls started listening. Same thing with Twitter.


They don't do as much censorship of conservatives as liberal censors would like.


TL;DR mostly a bunch of promises about policies they're going to review. Few decisions made yet.

But I do appreciate "Your lives matter. Black lives matter."


Have you noticed how corporations with dubious histories are falling over themselves right now to tell people "your lives matter and we're listening"?

It's nothing more than opportunistic grandstanding.


It will only 'heal divisions' if you are fair to everyone and do the right thing, instead of listening to the demands of the mob.

As an example, anyone saying they want to murder conservatives, white people, the president, or anyone not progressive should be banned from the platform.

At the moment, you say something that can even be interpreted as hate in an indirect way toward a minority group or person and it's an instant ban.

You can say you want to outright murder white people, cops, or even the president (I've seen all of these myself over the past few weeks) and it's not only allowed, but retweeted and shared millions of times.

Until this changes, it's not healing. It's giving special treatment to a minority group and allowing them to bully, harass, and threaten their perceived enemies in the name of equality.

The mainstream news simultaneously tells us that it's only 'white people' that are the ones agitating, rioting, and looting and then also saying that 'rioting is the language of the unheard'.

I really don't want to live in a dystopian future like this and will do anything in my power to fight it.


> I have confidence in the election integrity efforts we've implemented since 2016. We've played a role in protecting many elections and now have some of the most advanced systems in the world.

You have to be f'ing kidding me. Voter suppression (requiring fines to be paid, taking out polling places in particular communities), gerrymandering, foreign interference, etc., etc.


Heh, what election did FB protect? Which election was more legitimate due to the existence of FB?


Exactly



