Facebook Pledges $130M to Fund “Supreme Court” for Content (wsj.com)
100 points by otterley 41 days ago | 147 comments

In fairness, here in the UK at least, the print media are going nuts about Facebook and Google failing to stop fake news and politicians' lies. Coincidentally, they're bashing their competitors for something they themselves do just as much; make of that what you will...

That is one reason fb (and maybe Google, Twitter and whoever else too) are donating: make an independent sock puppet that can decide, and you just point at it when people complain. They're a "sin eater".

It's no different to how music and movie companies have banded together in the past to deal with "moral" crusades. I guess this is technically like regulatory capture, but where you build the regulator yourself. "Regulatory farming", maybe?

Personally I'm very dubious about fb (or anyone else) setting up an "arbiter of truth" and deciding who can advertise and what they can claim. But I prefer to think it's a marketing/political ploy rather than a dastardly plot to control our reality. They make 1000 times more money selling us sugar and then diet pills than political bs, but the political bs is upsetting that market.

The mass media is going nuts about not controlling the narrative anymore, that's why they want to bury big tech. They will lose in the end, so I don't know if it's worth it to fight.

Big tech also wants to control the narrative. Due to the nature of their platforms, it's much more difficult to do but that hasn't stopped them from making a mess trying.

Big tech just wants to suck in money in peace. They only suppress content to ensure that peace.

They cut out sexual content first because it's the biggest no-no in the US, where they come from. Bigger than Nazis or racists. But then people got riled up about other things: Nazis, racists, transphobes, misogynists, political lies.

Peaceful harvest becomes harder and harder.

No wonder they want to outsource all that noise. They want to be given clear direction on which part of the field is safe and which is the shifting sands of public outrage.

Zuckerberg is literally just begging to be censored. A censor will set up clear boundaries of the safe field current companies can share, and keep the status quo, since new players won't be able to attack them from the direction of a fuzzy boundary because there won't be any.

I think each time Big Tech tries to control the narrative more, it takes a step in a short-sighted direction.

I get wanting to pull down literal Nazis, and I mostly agree. But scope creep on who counts as a Nazi, and the ever vaguer "hate speech", will spell true trouble as the definitions get even more unclear.

The reason this is happening is becoming clearer. Despite what tech types like us believe and echo to each other on forums like HN, the reality is that most people live in an apathetic environment.

Most people, most content creators, most content consumers, most people in general, really don't give 2 mosquito dicks about any of the issues I think are important.

Privacy from the government? Nope. If you're not a crook, you've got nothing to fear.

Tech companies having too much info about you that they will share with the government? Ditto.

Nazis can't get free speech? Who cares. In fact, GOOD, I don't want that crap popping up on my kids' recommended list anyway. Good job YouTube!

Lately I have been beginning to understand the magnitude of the problem. Apathy is far more powerful than any government.

And the worst part of all? The only thing they have to do to fuel apathy is keep the trains running on time and keep the walmart/grocery store shelves stocked. That's all.

I'm becoming convinced that this problem is being attacked wrong. We need to stop talking about Nazis and politicians and start talking about some group that is actually respectable that may be shut out. I don't know off hand what that content would resemble. But I do know that almost no one likes politicians, nazis, and pedophiles. So it's making me believe that continuing to have those groups attaching themselves to our arguments is counterproductive at this point.

Surely there should be something disconcerting about being massively disagreed with, and finding that you can't imagine any cause more respectable than pedophiles and fascists?

I agree with you that these things are important, but why do you think so if you can't formulate a good argument for them that doesn't rely on people's sympathy for Nazis?

Because I'm of the opinion that Free Speech is important to protect speech that most people do not like. Not really to protect speech that most people have no problem with in any case.

Why do you think it's important if not to protect speech that most people do not like? I mean, if it's speech or behavior that no one has a problem with, why does it need your protection?

I think that line of reasoning is wrong, because it implies that social media has resulted in some kind of "pulling back the curtain". What it's really shown is that, while the media might be able to spin lies people believe, the only thing that kept that power in their hands exclusively was infrastructure. What happens instead when "nobody controls the narrative" is that people go pick their own narrative, and not in a way that's necessarily any closer to the truth.

I don't think it's about pulling back the curtain, I think it's about there being another curtain.

The internet, even "big tech" platforms with 1000 different ML algorithms for downranking and banning things, is just a different medium than broadcast media altogether, so there's bound to be tension between tech companies and broadcast media, even ignoring the whole ad money thing. There's so much going on at the same time and there's just no presumption that any particular piece of content is authoritative or popular like there is on TV or in a newspaper.

To put it another way, it seems like it's not just about the difference between the framing of the "big story" on the NBC evening news and on TruthS33k3r_42's channel; it seems more about the difference between having or caring about a single story at all, and perhaps just chatting on Discord/forums/Twitter/Messenger about something completely separate.

Murdoch and sons seem to have figured it out. They take the false narratives spun out of social media and rebrand them. In fact their job is probably easier these days, since the work of coming up with the spin has been outsourced, complete with A/B testing, so they just need to editorialize.

Sadly, I think you’re right. Traditional media is hemming and hawing about this mostly because now there is someone else who is doing what they did for so long which was to set narratives with little pushback.

Now they’re losing grip on relevance and they don’t like it one bit.

For a lesson, see their reaction to Richard Jewell vs All The Presidents Men. They like being seen as heroes, even if the facts are stretched, but if they are shown less gloriously, they complain about the fictionalization aspect...

"Print media" are news organizations though, right?

Facebook and Google are platforms, analogous to a printing house. They let anybody publish their opinions, without endorsement.

You wouldn't go nuts at HarperCollins for printing Mein Kampf, or at a bookstore/library for distributing it.

And yet, the experience of a social media feed is much closer to a newspaper than to a publisher's catalog... Why do publishers and newspapers have different sets of responsibilities, by the way?

Not really. You can quite selectively choose which authors you want to read and which you don't.

That's nothing at all like a newspaper, where everyone gets the same issue.

> Why do publishers and newspapers have different sets of responsibilities, by the way?

Because that's how the world works: different people have different jobs.

The newsstand/ISP delivers information which was put in readable form by the printer/website platform syndicated by the news agency/subreddit and authored by the reporter/person.

> "arbiter of truth"

They should just call it the Ministry of Truth, or even Minitrue as a nod to Orwell.

"Regulatory farming" is a great concept. There is a very similar concept called "regulatory shopping", where institutions can choose the (existing) regulatory agency that they want to be regulated by.


Why not just go Twitter's route and ban political advertisements? Then you can still sell the money-making sugar and diet pills without the poison of politics.

It's not a bad idea, I'd actually favour it myself because "screw all this noise", you know?

But there are a few issues.

First, practically, you now have to decide what is "political". Is an ad for abortion services political? What about the NRA's latest membership campaign? Or a gun safety campaign? Lots of things are political to at least someone. So who decides what's political? You're still in something of a mess. And you technically have to ban ads that are entirely factually true now!

Similarly, even if you can stop the ad next to a celeb tweet being political, can you stop the same buyers paying celebs to tweet fake news directly, or Russian bots creating 10,000 accounts and tweeting fake news? Will you go through @Potus tweets and cut out the fake news? Suddenly the liberal media is eating Trump babies.

Second, it doesn't solve your core problem: traditional media, from the Guardian to Fox, will now attack you for being anti free speech (AND for every time an ad gets through your filter that looks like it shouldn't). They don't actually care about whether you publish fake news. They publish it. They just want you to die. Plus now you've offended your freedom-loving users, and alienated the politicos regulating you. Maybe under the 1st Amendment US politicians can't FORCE firms to take their ads. But they can audit them, drag them in front of Congress, cut net neutrality so ISPs eat their profits, etc.

Ultimately, it's down to users to filter their media and fact-check it. It always has been. There was never a golden age where newspapers reported facts in order of importance and politicians were honest and policy was fact-driven. But no one wants to admit that. Facebook made that mistake; it was reported as "Facebook will publish fake news, eat babies, fuck your wife".

Fwiw, the US has rules on foreign contributions to political campaigns which have already explored these questions in courts, up through the Supremes...

While this is irony at its finest, I actually think this is a good idea.

We expect FB and Google to exercise editorial control, but they can't, for two reasons. The first is the narrative we have subscribed to as a society: that they avoid legal liability because they are "platforms", aka common carriers of other people's content; clearly they didn't make it. Yet FB and Google can clearly do content regulation when the law requires it (exhibit A: Nazi propaganda in Germany, or illicit images) and even where the law does not (exhibit B: posts or YouTube content with background music being muted due to copyright claims). Cable TV providers clearly censor ads and content as well. Some of that is upstream of the local cable TV delivery service, but in essence they all operate in "cahoots".

The second is the slippery slope: enforcement in the case of Google and FB is the threat of antitrust investigations. This is the "Sword of Damocles" that hangs over them. So setting up an independent tribunal is a way to maintain a suspension of disbelief.

> A corporate trust managed by financial services firm Brown Brothers Harriman will in turn oversee the board’s budget and administration

Any details on how board members will be appointed, vetted and selected?

This looks like Facebook creating a red herring it can blame for the consequences of policing its platform.

from the article,

> While Facebook will select the first 11 members of the board, the eleven will choose the remaining board members, with terms lasting three years.

facebook picks the first 11 board members, and those 11 then staff new appointments for 3-year terms, but it's not clear whether the first 11 also control staffing throughout the org.

if this is a "facebook supreme court" then it's kinda fishy if the stakeholders get to appoint the first 11 judges...

edit: also, tsk tsk Wall Street Journal. if you use numerals (11) in a piece of writing, don't switch to the written number (eleven)!

> it's kinda fishy if the stakeholders get to appoint the first 11 judges

So it's a Facebook-controlled board.

An honest mechanism would involve outside groups from the start. A legitimate mechanism probably requires the government to step in.

This is simply a deflection mechanism, a board for PR.

> if you use numerals (11) in a piece of writing, don't switch to the written number (eleven)

This is common style advice. The numeral represents a quantity. Written out, it represents the body per se.

Frankly I agree about gov intervention in such matters, because how can you really, honestly take a company at its word that its self-appointed accountability board is actually independent? Or has power to make decisions that would hurt Facebook? And companies shouldn't really be expected to do that, or at least we shouldn't expect that over a government solution. Something like this is exaccccctly what government is designed to handle vs what business can be expected to do on its own.

> While Facebook will select the first 11 members of the board, the eleven will choose the remaining board members, with terms lasting three years.

So Facebook reserves its right to dictate who serves on the board, whether directly or indirectly.

The millisecond I clicked on this article, the first sentence looked very strange:

> Facebook Inc. will pay $130 million to establish an independent board charged with reviewing how the company moderates its content...

First of all, isn't it ironic that the owner of the largest social network in the world is attempting to fund an independent court/board for content with Facebook money? And if they're not the only ones backing this, then having both them and other private corporations fund it while trying to maintain the institution's independence is going to create lots of trust issues.

Sounds like this is the establishment of the Facebook "Supreme Court" for Content.

> an independent court

Facebook picks board members who then pick more board members. The sole financial backer is Facebook. At a whim, Mark Zuckerberg could ignore, re-constitute or disband the board.

It's far from independent.

This is about the time when I pine for a higher fidelity knowledge of history so I could make a pithy comment about the similarities between the reign of the Candy King and ....?

Alas I cannot, so I'll just continue to foo(); bar(); baz(); my way along.

Hey I am a Facebook employee and I am really confused by all the comments here. Really feels like a "Damned if you do/Damned if you don't" situation. Can anybody actually suggest a better alternative? Not saying there isn't one, I am just not seeing one being proposed.

> Can anybody actually suggest a better alternative?

Ban non-geographically targeted political ads. Create a public repository of all political ads.

Creating a puppet entity and calling it a court is dishonest and insulting.

(The only proper solution, of course, is breaking up Facebook into, at the very least, Facebook, Instagram and WhatsApp. Concentrated market power is the fundamental problem.)

> Ban non-geographically targeted political ads.

This actually sounds like a good idea. Ads about issues pertaining a whole region should be seen by the whole region.

> Create a public repository of all political ads.

Doesn't this exist already? https://www.facebook.com/ads/library/

> The only proper solution, of course, is breaking up Facebook into, at the very least, Facebook, Instagram and WhatsApp.

Not saying this wouldn't be good, but how would this help with the whole content moderation debacle? It looks to me like Twitter, YouTube, etc. have the same issue. This specific problem doesn't sound like something that would be magically fixed if there were more competition, though I might be wrong.

I disagree strongly with closing off an entire sector of content to political advertising. It seems like the goal, or at least the unintentional effect, is to reinforce the traditional media’s monopoly on political discourse.

Traditional media political ads can't micro-target people who are more likely to get sucked into their made-up "facts" and conspiracy theories.

Traditional ads are also broadly broadcast, so they can be disproven and discussed. Targeted ads are instead hidden away, shown only to people who won't question them.

If facebook wants to do political ads, then it needs to disconnect them from targeting, publish them for public display, publish who paid for each ad, and moderate them to some basic level of truthfulness.

> people that are more likely get sucked into their made up "facts" and conspiracy theories.

If these people are so easy to fool and manipulate, why do we allow them to vote? If all it takes is some post on Facebook to change their votes, there will be dozens of other easy ways to manipulate them. Are the other ways better because they're more opaque, or because they tend towards one ideology or another? Almost all of these arguments boil down to "people are dumber than me, so I should be able to control what they see". Dress it up however you want, but ultimately this is about feelings of intellectual superiority and a desire to control outcomes in favor of those who wish to control the narrative.

Not dumber, but easily influenced by lies. Similar to how scammers prey on the elderly.

> Traditional media political ads can't micro-target people

They may not "micro-target", but they can definitely target them.

Any political campaign will tell you they do their damnedest to ensure their ads are running at the right times on the right TV/radio stations.

Honestly, hours of traditional media itself (Fox, MSNBC) likely does far more to bias opinion than handfuls of 30-second ads.

Do you think direct mail should be banned too? That is the original microtargeting platform, and is a $44bn industry (so only recently surpassed by Facebook in $ terms)

> Do you think direct mail should be banned too?

It should, but only because it's incredibly annoying.

I’d put it differently: targeted ads allow campaigns to do more with less by focusing on people who are more likely to be receptive to their messages. That’s a valuable thing that shouldn’t be prohibited.

> targeted ads allow campaigns to do more with less by focusing on people who are more likely to be receptive to their messages

Fair point. I suppose targeting could be balanced against a public cool-down period.

So you can randomly target people in the district now. Or, make your targeted ad publicly available for N days before it goes out. This would let the public see it, check it, and potentially respond before the damage is done.

No censorship. Just a head start for the public if a campaign wants to target.
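The cool-down proposal above amounts to a simple eligibility check. A minimal sketch, assuming a 7-day window (the comment leaves "N" unspecified; the function name and constant are illustrative, not any real platform's API):

```python
from datetime import datetime, timedelta

# Hypothetical policy constant: how long an ad must sit in the public
# repository before it may be micro-targeted. "N" is a policy choice.
PUBLIC_REVIEW_DAYS = 7

def may_run_targeted(published_at: datetime,
                     now: datetime,
                     review_days: int = PUBLIC_REVIEW_DAYS) -> bool:
    """True once the ad's public review window has elapsed."""
    return now >= published_at + timedelta(days=review_days)

# An ad published three days ago may not yet be targeted...
assert not may_run_targeted(datetime(2020, 1, 1), datetime(2020, 1, 4))
# ...but once the full window has passed, it may.
assert may_run_targeted(datetime(2020, 1, 1), datetime(2020, 1, 9))
```

The design choice is that nothing is censored: the check only delays targeting, giving the public a window to inspect and respond.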

> I disagree strongly with closing off an entire sector of content to political advertising.

I feel like you are inadvertently misrepresenting the OP's point. The OP specifically referred to geographically targeted ads, which already have a long history of being manipulated to subvert the democratic process by foreign actors, with disastrous consequences, and instead you somehow are talking about "closing off an entire sector of content to political advertising"?

It might not have been your intention but your post sounds like a disingenuous strawman.

OP said “ban non-geographically targeted ads.” Besides, the “long history of being manipulated ... with disastrous consequences” is a [citation needed] assertion. Certainly, the thin support for that position doesn’t overcome the fact that it makes total sense to allow targeting ads by geography. There is no point in Trump running ads in Baltimore, or Warren running ads in rural Virginia.

> OP said “ban non-geographically targeted ads.”

Yes, and you've made a gross misrepresentation by equating it with "closing off an entire sector of content to political advertising." It's a very disingenuous take on what was actually said.

> Besides, the “long history of being manipulated ... with disastrous consequences” is a [citation needed] assertion.

Really? In this day and age are you still trying to turn a blind eye to the elections in the US and the UK?

> Certainly, the thin support for that position doesn’t overcome the fact that it makes total sense to allow targeting ads by geography.

You're again coming off as very disingenuous by misrepresenting what was actually said, as the OP specifically talked about non-geographically targeted ads and you in turn decided to talk about the exact opposite.

Genuine question - where is the concrete evidence of harm from the interference in the 2016 elections? We hear a lot about how Cambridge Analytica destroyed the political process, or Russia... But what are the facts in evidence for harm?

Put another way: how can we determine actual harm caused, and be sure that it isn't blown out of proportion? (IE Cambridge Analytica playing up their capabilities to get new customers)

How would breaking up FB into Facebook, Instagram and WhatsApp help with this at all? They don't even share ad inventory. It looks like you're out to hurt the company and that's it.

> How would breaking up FB into Facebook, Instagram and WhatsApp help with this at all?

Moderation policy is an axis of competition. Letting each platform explore moderation systems (a) lets us compare results, (b) reduces the damage a single company's bad policies can inflict, and (c) gives users on both sides alternatives if they disagree.

It's called punishment, and it helps to discourage negative behavior.

Ah, okay. I thought we were looking to solve a problem, instead we want to inflict vendetta because we don't like something.

> Assumes FB causes no harm because we've yet to articulate the problems it causes.

Look through my posts in this thread. FB causes shitstorms of damage, not the least of which is aggregation/indoctrination of dissension through isolation.

FB is basically pushing monsters outside of its gates in hopes they "go away". They wont. lol.

Also, there are legal practices called "punitive damages", so yes, there exist legal mechanisms for punishing things we have determined we don't like.

The problem is that this leaves you open to, e.g., car companies putting out ads that promote their gas-guzzling cars while banning climate change ads that might warn of the consequences of using such vehicles.

If you do what Twitter is now doing[1], "restrict rather than outright ban", you're back to having to serve as an arbiter.


> while banning climate change ads that might warn of the consequences of using such vehicles

My comment suggests no such banning. The ads simply couldn't be micro-targeted. They'd also need to be made public, so the public, journalists and other interest groups could review, vet and respond to them.

I have no obligation to use any of these services, why do they need to be broken up? Google is a far more threatening power over my day to day activities.

Even if you don't use these services, they use you. The tools these companies provide to different websites to use will collect data about you. So just visiting a website you are affected. You seem to recognize how Google affects you, but the FB products do as well.

So in that case you are essentially using their products, because their software and services power lots of other services. There are alternatives; I just don't care enough to pursue them. Who cares if someone has tracked my data? I like having the internet the way it is.

They all have an enormous effect on your life whether or not you use them. Yes, Google should be broken up too.


Facebook's immense power makes it much more difficult for competing messaging systems, social networks, VR-related companies, and others to enter the market. When they do find any success, Facebook usually acquires them. The effects of monopolies and oligopolies on economies and societies are well-established.

If you live in the US, or Europe, or pretty much any country Facebook operates in, the company is spending millions of dollars every year to lobby your government. While that may not sound like a lot, political influence is surprisingly cheap, and it affects you whether or not you choose to participate in the political process. The CEO meets regularly with the president of the United States and his close associates, and has met and spoken with many other world leaders.

Facebook is a major facilitator of violence and ethnic cleansing around the world (https://time.com/5197039/un-facebook-myanmar-rohingya-violen..., https://www.reuters.com/article/us-facebook-india-content/fa..., https://www.buzzfeednews.com/article/ryanmac/chinse-media-fa...). You might argue that if they couldn't use Facebook, people would just use some other platform or tool to accomplish the same goal. That's possible, but if it's happening on Facebook, the company has a responsibility to take action, and they have failed. I don't think there is any other platform that could be used as effectively as Facebook and WhatsApp for this purpose.

Even if you don't use Facebook, they likely have a profile for you if anybody you know uses it. They collect contacts from peoples' phones, and scan for faces in all photographs (and correlate the results with other photos). If you don't have robust ad and tracker blocking in your web browser, Facebook is tracking your behavior with their ubiquitous like buttons and analytics scripts.

> I have no obligation to use any of these services, why do they need to be broken up?

Because they are a dragnet of private information that is used specifically against your best interests, both private and public.

I can sympathize with that feeling, but you sort of are in a "Damned if you do/Damned if you don't" situation. No one is expecting the wolf to stop eating sheep, and as admirable as its proposal to fund a shepherd might be, we would be negligent if we didn't doubt its sincerity.

If I have friends or groups on facebook, I want to see what they put up. If I don't like what they put up to the point that it really bothers me, they won't remain my friends or groups.

You're policing speech. It hasn't historically worked well.

I would prefer that Facebook lose enough power and influence that this is no longer necessary, probably by being broken up.

Have content reviewers be employed (not contracted!) by this third-party organization, with independent editorial control over the content they're reviewing. The "Supreme Court" would then ensure that their editorial decisions are just and sound, and provide Facebook a forum to request changes in editorial policy without the implied "do whatever we say or else we'll fire you and replace you, you cogs" that the reviewers suffer under today.

How about an independent third party (outside the company) that has a binding contract with Facebook? This third party could handle other social media sites as well that choose to opt in.

Personally, I'd rather just see Facebook stop being such a huge force in the market and instead move toward a decentralized model like Mastodon with ActivityPub, but there's no way Facebook is going to go that voluntarily, nor is there any reason that they should.

So, at the very least, they could make this organization completely outside of Facebook's oversight, kind of like how the ESRB works. Maybe this is their end goal, idk, but if they choose to retain control, people are always going to be skeptical.

Here’s my comments and I hope to be as constructive as possible:

1) I listened to Zuckerberg's podcast on free speech. While it had the appearance of adhering to the principles of the First Amendment, I think the point of the podcast, and the reasoning of its participants, is that in Facebook's eyes the First Amendment shouldn't be the only consideration.

For me personally, I am close to what con law people call an absolutist: the First Amendment means what it says. However, the First Amendment says nothing about TV, radio and the internet (for obvious reasons). That's where I am willing to extend my definition to First Amendment + Supreme Court precedent. This is the standard I view as the gold standard of individual liberty. I stopped using Facebook last January as an experiment / New Year's resolution to focus my time more effectively, but I would say after hearing the podcast, intellectual and well-meaning as it was, the positions presented are completely incompatible with my view of individual liberty. I may decide this new year to simply deactivate my account altogether for this reason.

2) There is a growing trend of companies acting like governments. For example, look at the surveillance powers created by Nest/Amazon. Look at the efforts to use private satellites for sub-1m resolution surveillance (formerly the only application was defense/intelligence related). And now look at Facebook and Twitter deciding on new rules for speech/content. Facebook's Supreme Court is a way to manage the complex arguments for going beyond our own legal system. It is born, in my opinion, both as a means to enact rules that curb individual liberty and as a way to subvert the existing expensive and lengthy process of taking something to the Supreme Court. And that's just in the US... never mind countries that don't have the legal infrastructure to decide these issues.

In short, this is Facebook deciding it can hack / disrupt government. There’s no coincidence this is happening at the same time that it is introducing libra.

I am not a citizen of Facebook. I’m less than a citizen. I’m simply a physical embodiment of a life whose digital representation is owned and marketed by a private company. And now it wants to govern my digital representation’s speech and money. And I don’t single out Facebook. How long before I can subscribe to a private Nest police force?

Government is slow. Government is inefficient. Government is corrupt. As far as I’m concerned governments sole purpose is to protect my individual liberty. What is facebooks sole purpose?

My solution is do not use Facebook.

> My solution is do not use Facebook.

How effective is that? Doesn't Facebook collect data on everyone's online behavior even if they don't have an account?

It may be that FB collects data on everyone they can. The subject of this article is about content/content moderation. If I do not produce or consume content on FB then I might not care about or be subject to any moderation practices.

or instagram, or whatsapp, or messenger.

This could be easily solved if moderation records were made public for public posts. Keeping the process opaque will never quell doubt about its impartiality. I know if I saw a record of what post was banned, which policy it violated, any appeals process records, and a list of previously banned posts by that user, I would be able to feel confident that the system was being managed well, and a strong defense for FB and its private judiciary to demonstrate its application of company policies.
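As a rough illustration of what such a public record might contain, here is a minimal sketch; every field name is hypothetical and nothing here reflects an actual Facebook schema or API:

```python
from dataclasses import dataclass, field, asdict
from typing import List
import json

# Hypothetical shape of the public moderation record the comment describes:
# what was banned, under which policy, appeal outcomes, and the user's
# previously banned posts.
@dataclass
class ModerationRecord:
    post_id: str
    policy_violated: str                               # which policy the post was banned under
    appeal_outcomes: List[str] = field(default_factory=list)
    prior_banned_posts: List[str] = field(default_factory=list)

record = ModerationRecord(
    post_id="12345",
    policy_violated="hate-speech",
    appeal_outcomes=["upheld"],
    prior_banned_posts=["9876"],
)
# Serialize for publication in a public, append-only log.
print(json.dumps(asdict(record), indent=2))
```

Publishing records in a shape like this would let anyone audit how consistently a policy is applied across users and posts.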

There's no "better" alternative, really: if Facebook censors content, which it absolutely does, and which it will be forced to do by politicians even if it didn't want to, then Facebook is also responsible for those censorship decisions. Making a sock puppet to deflect responsibility is a transparent move that is not going to fool anybody. It makes the situation slightly worse, as it insults the intelligence of the community by implying this transparent trick could work, but otherwise everything stays the same.

In theory, Facebook could figure out a way to have a truly independent panel of experts that are non-partisan and aren't beholden to Facebook in any way, but I have no idea how they could do it in practice. Even if they did, it'd backfire on them very quickly as soon as that panel made a decision that some activist group didn't like: those activists would surely target Facebook, and Facebook would have to choose between annihilating the credibility of the panel by ignoring its decision or facing the wrath of the activists, which Facebook has proven many times in the past it can't withstand and will have to cave to anyway.

So I really don't see any positive outcome here. Once you're in the censorship game, you're in the censorship game, with all that involves.

For anyone curious, this is the same Pesenti as the one that founded Vivisimo.

Some of the people replying to you are a bit... ick, but:

Why not give the funds to someone battle-tested and with a good reputation like John Gilmore to distribute for the purpose? Something can't be independent with as much Facebook control over it as the post implies.

If this is to be serious, there needs to be absolute transparency about how it is independent.

As it stands, there is zero evidence of this. If Zuck is displeased with it, then he can just not contribute more once the initial money is used.

For starters, Facebook could offer a paid version of Facebook that allows people to completely opt out of all ads. I would be happy to pay $4 a month for this option (or maybe something like $30 a year), even though I use adblock to block all your ads anyway.

Also, I would really like to just see my activity feed from all my followed friends in straight up chronological order, with no filtering, dang it.

The way you all display stuff right now is obviously designed so you can sprinkle ads into the feed as if they were actual content. But I don't want to see ads. At all. Period.

I don't blame you guys, and don't envy the position you're in.

Near as I can tell, the mainstream media has spent years telling conservatives that their voices are being silenced at a disproportionate rate by left-leaning Facebook employees. At the same time, they're telling liberals that Facebook is deliberately allowing the spread of disinformation to help more conservatives get elected. Almost as though someone wants the left and right united in their hatred of social media.

The next step is obvious: pass laws to cripple the major social media companies and restore the mainstream media to its former position of unchallenged power.

If you don't like the smell then you could simply stop selling (political) ads. Facebook made the decision to enter the advertising business, this is just par for the course.

> Can anybody suggest a better alternative?

Yes, quit attempting to police speech.

We're still policing reading.

DRM on digital books, "renting" books on digital platforms, banned books by governments, banned books on shopping sites, religions sentencing and/or cursing the "unclean" reader, communists banning religious books....

> Really feels like a "Damned if you do/Damned if you don't" situation

Maybe this is something that you (=FB) should have given some thought before navigating yourself into that situation. Finding a solution for this is hardly anything that outsiders should have to do. Solve the mess yourself and better do it fast.


That crosses into personal attack and is not ok here.

If people get harangued for showing up and mentioning their employer, that will only disincentivize them and others to participate on topics they know a lot about. That will make HN strictly worse.


Yes. Realize that FB is the de facto platform for social discussion and realize that discussion at this scale is the singular path towards cohesive integration of multicultural societies.

Stop policing speech/thoughts/ideas/communication.

Nobody has asked facebook to be the arbiter of social morality.

You at facebook are creating an echo chamber and driving dissenting voices away because they don't adhere to the 'woke' dogma.

You're taking all the monstrous ideas and pushing them outside. You're giving them no option but to aggregate + further radicalize each other.

This is why shit like 8chan has gotten so wretched.

There's already WAY too much involvement + dogmatic oversight of 'moral right' by facebook.

Stop doing what you're doing and accept that if you're ACTUALLY interested in the mass-culture integration you claim you're working towards, banning + policing thoughts aren't the way toward that end.

What actually needs to happen is for the extremes to be buffered through interaction w the socially normal beliefs.

What facebook is CURRENTLY doing is radicalizing groups.

Somehow zuck can't see this as the repressive puritanical system that it is.

You're denying the existence of a problem, not offering a solution. Implicitly, this says that you don't value the opinions of people who very strongly believe that there is a problem. It's not a great place to argue from.

> You're denying the existence of a problem, not offering a solution

Actually, I believe @unearthed acknowledged a problem, and asserted the best solution was to do nothing.

More than just this though.

Because FB's take is so uninformed, it's going to create greater social rifts.

LITERALLY contra to what its stated objectives are.

SO. WHY would zuck do this?

lip service to his employees.

> You're denying the existence of a problem, not offering a solution

Incorrect. I'm saying the solution IS NOT the dogmatic oversight that facebook is pursuing here.

Instead the solution is: "The droves of people using facebook are sufficiently large that an organic zeitgeist can emerge without interference"

Surely You're NOT suggesting:

"Yes we require the dogmatic, bureaucratic, oversight of speech to create harmony"

> We require the dogmatic, bureaucratic oversight of speech to create harmony

This is an open, legitimate debate. It's complicated with Facebook due to its lack of competition.

Companies moderating speech on their services is time tested and likely necessary. If the market presents a diversity of services, no one system of moderation gets a stranglehold on public discourse.

Facebook, however, is dominant. As a result, its decisions approach the breadth of government censorship. That's a fundamental disconnect they're trying to paper over.

> It's complicated with Facebook due to its lack of competition.

Twitter, Reddit, Snapchat, Marco Polo...lots of competition in the social media space.

Is there a successful FB clone, a la Google+? No there is not. That does not mean FB somehow owns social media.

> Companies moderating speech on their services is time tested

Not at FB's scale.

> Companies moderating speech on their services is time tested and likely necessary.

Here's the thought experiment validating what I'm claiming about driving out dissenting views:

Where do the ideas that are pushed out of facebook go?

Do you think they just go away? Absolutely not.

The problem is zuck is culturally beholden to his droves of socially 'woke' employees. Nor does Zuck strike anyone as a person strong enough to do anything other than 'toe the line' of that narrative.

Socially, he's a lap-dog.

Which is why this 'Supreme Court' is being established. Homeboy can't shoulder the weight (in the court of public opinion) of the moral judgements being made, so he's throwing $130 million so someone else's name sits at the top of that list.

IMO the only thing this is doing is creating monsters + pushing the work of social integration down the line. It's a bad short-term strategy.

Atop the list of problems here is that facebook loses oversight/engagement w the most dangerous of outside thought-leaders, inadvertently putting itself in the dark.

Many, many people believe that the hands-off approach has, in the fullness of the history that brings us here today, demonstrably failed.

The argument is that we have more violent extremists, not less, and they've been quite successful in using large public platforms to recruit and radicalize new members. The "organic zeitgeist" has had plenty of time to do its thing, and it really hasn't.

You're offering a straw man argument, by way of false dichotomy: kafkaesque hellscape vs total freedom happyland. It's not helpful to the conversation, imo, and doesn't actually engage with the issues at hand. It's polemics, not a solution.

1) Chill w the superfluous language. Nobody is impressed with how much debt you've saddled yourself with over your phd.

2) You cite nothing substantiating these claims.

Leave it to a woke phd to misrepresent a claim to win an argument.

> total freedom happyland

No you dumb fuck.

I'm saying that the process of people sorting out a collective, integrated social morality is going to be complex.

Where the fuck did I say it was going to be easy?

> kafkaesque

Yes, it's a big deal. How long do you think the British people are going to put up with their daughters being raped and people being stabbed? Do you think that "acceptance" of Khan's woke bullshit will continue into perpetuity? Do you think there are some really mean fucking people who are basting in anger + isolation because they've been pushed out of pleasant society?

Does such a thought seem plausible at all? What do you think happens before a coup of any kind? Someone capable sits in isolation + gets really fucking pissed off.

This is the problem with having discussions with arm-chair intellectuals in a space where they can behave with impunity.

You're happy to attempt to win an argument by misrepresenting my claims. The intelligentsia of the bay area is absolutely reprehensible -- and I know, I live here with this bullshit.

Fuck, would I love to have these discussions face to face.

> kafkaesque

Let's sit down and get a coffee and chat over this shit and we'll see which of us is more capably evil.

Obviously you can't post like this to HN. We've banned this account for breaking the site guidelines.

If you don't want to be banned, you're welcome to email hn@ycombinator.com and give us reason to believe that you'll follow the rules in the future.


stop censoring and don't listen to hacker news

don't listen to hacker news

This is critical advice for a lot of topics.

Sometimes you paint yourself into a corner and no good options remain.

Trust is hard to earn and easy to lose.

Stop trying. This isn't your problem. Whether or not Brexit is a Soros plot is not something a glorified BBS needs to worry about. Should AMD be pressured into adding logic to block people from making deepfakes?

I know there is a tendency to be consumed with your own importance combined with political heat, but the best course is to develop great filtering tools for users and “Let the people decide.”

I don't know about you, but my family and I like living. Are you going to handle all the political heat with just a sturdy front door?

The "is this the platform's responsibility?" debate has long been over, and everyone (EU and USA) agreed that it is FB's responsibility. The states in which Facebook operates have regulations about the internet/advertisements/fake news, and FB must find a solution to comply globally with all of those regulations.

If it was only Facebook, FB would accept any ad as long as someone is willing to pay to display it.

What state has laws about fake news?

How about Facebook spends some money fighting for a better government? Right now billionaires and corporations are running stingy charities where they decide how much money is spent and how it is spent, mostly in their self-interest. It would be more in the public interest if Facebook and other corporations just handed over their fair share of taxes and got out of the way of the government. They should also respect the laws and regulations in the other countries where they do business. Quite frankly, in my country of Canada, we have regulations on news in order to keep it honest (no Fox News bullshit here!). But Facebook is increasingly becoming a news outlet, and so it must start to be treated like one.

Are we supposed to believe that the people receiving the $130M will be somehow independent and make decisions without influence from Facebook? That doesn't sound very plausible.

> The money marks a significant investment in an organization that doesn’t yet exist, but could take on responsibility for some of the company’s thorniest decisions.

I hope what doesn't happen is that this turns into a way for Facebook to delay/confuse/distort how decisions are made, and ultimately give Facebook itself a way to shirk the blame.

That's exactly what this is. They're trying to avoid regulation of any kind, and "third party audits" are historically a good way to keep the federal government out of your hair for at least a measurable period of time. The financial industry learned this quickly.

This actually does feel like a step in the right direction. Having _any_ sort of appeals process in place is definitely a step up from the Google/YouTube approach of: you can be banned by an automated process for no apparent reason and if that happens there's absolutely nothing you can do about it except raise a stink on social media and hope your story gets enough traction that someone capable of doing something about it notices the problem and fixes it.

Facebook already has an appeals process for content. This just adds another level, and I'm fairly dubious it'd be any less capricious.

They don't want to be held responsible for moderating content, so… they're creating an organization funded solely by themselves. How is that any different from just doing it in-house?

It worked well for the movie studios and MPAA. The MPAA ratings board has the veneer of being independent without actually being too independent.

The MPAA is an Association. It's a small government for studios. It's not intended to be independent, it's intended to balance power among its members. If there was only one studio it could do ratings in-house.

It is likely that an organization like this will become a de facto standard used by all big social networks as a way to get government off their back, much like the Hays Code and then the MPAA ratings board for the movie industry.

Because it lives outside of Facebook and can offer its services to other online communities. It's a smart play if your goal is to own governing worldwide speech.

The amount of money required to adequately do what the world wants facebook to do might exceed the amount required to fund the 3rd party org

If the org is not as competent as people wish it to be, the blame is somewhat insulated

This sounds like the Hays Office.[1] That was the Motion Picture Association of America's censorship unit from 1934 to 1968.

[1] https://en.wikipedia.org/wiki/Motion_Picture_Production_Code

They should name it Goskomizdat

"Ministry of Truth" has a nice ring to it [1].

[1] https://en.wikipedia.org/wiki/Ministries_of_Nineteen_Eighty-...

> While Facebook will select the first 11 members of the board, the eleven will choose the remaining board members, with terms lasting three years.

Call me skeptical but that's not enough to distance yourself and set up a "fair" court. Facebook still has lots of influence and I doubt they'll recommend something that would truly hurt Facebook's bottom line.

How is the board "independent"? Just calling it "independent" does not make it "independent." It sure sounds better, but "Facebook Creates Star Chamber of Moderators to Rubber Stamp Decisions It Wanted to Make Anyway" doesn't sound as respectable.

Facebook: a place to share things approved by a board of corporate trustees.

The wolves voting for free range sheep.

Why does facebook continue to allow the running of political ads and all the headache that comes with it (despite it being a minuscule piece of its overall revenue)? Power. You can see this as being the most tone-deaf way possible to address the problem, or as a power grab to become an arbiter of truth.

The right-wing has been claiming anti-conservative bias in big tech for a while now. Banning political ads, even if done across the board, would be likely to ramp that up dramatically. https://www.theverge.com/2019/8/6/20756734/trump-google-anti...

For example: https://www.marketwatch.com/story/republicans-criticize-goog...

> President Donald Trump’s reelection campaign and other Republican election groups criticized tech giant Google on Tuesday for making it harder for political advertisers to target specific types of people. The GOP groups said the changes will lead directly to suppressing voter turnout and would “disproportionately” hurt Republican candidates.


Said right-wing currently controls the Senate, the White House, is appointing large numbers of Federal judges, etc.

It's not shocking for Facebook to want to avoid worsening relations with people holding significant power to regulate them. No one really wants to be the subject of a Trump tweetstorm, either.

They're just trying to avoid a ruling like this one: https://www.mtsu.edu/first-amendment/article/583/pruneyard-s...

for social media platforms.

What will be the equivalent to the Bill of Rights, on which this new court will base its opinions?

The forest was shrinking but the trees kept voting for the axe as its handle was made of wood and they thought it was one of them.

In the absence of any context, adages like this don't add anything to the conversation. Every reader who encounters it will nod and say "yes, that's so true of the other side".

Thankfully there's the context of OP's post.

Hrm. I'm curious how this "Supreme Court" will interact with Facebook's cadre of paid content moderators, and whether that "Supreme Court" will have the power to advocate for that group of workers--or if this is strictly going to be a single-serving organism in the company.

I'd certainly be less incredulous about this move were there answers to these inquiries.


I am not picking on FB here. I find it interesting humans created this network. It has some benefits and plenty of issues due to its business model. There is strength in the network but the network could succumb to a loss of participants and thus lose its power. Why are we not advocating for folks to simply leave the network? I know I am removing lots of the gray areas, but I am genuinely curious. Is there really no other group of builders that can provide the market with a FB alternative? I keep reminding myself that at one point in time AOL felt like the only access point to the internet that mattered...

"Supreme Court"..this sounds nothing like the Supreme Court.

So what happens is that Facebook chooses 11 people and gives them money so that they can make Facebook censorship decisions look like they don't come from Facebook. But since the people in that "court" are still appointed by Facebook, it's all just a shell game. Now they can point at "independent verification" by people they chose to pretend there's some objectivity behind it (even though there can't possibly be).

Wolves are funding a supreme court over the fate of sheep.

I can't wait for this brave new world where you will need to implement a $130 million supreme court to start a new social network.

$130M sounds like a tiny sum for content arbitration at Facebook’s scale and the number of jurisdictions it crosses. I mean, nobody’s done it before, so it will be interesting. But I don’t think this is or could be a serious attempt to solve the systemic issues at stake with the platform.

Great! Corporate takeover of our commons. Sounds completely kosher to me.

They'll need to hire idubbz so they can have a proper content cop for the content court.

HN et al. have become really quite cynical about FB these days, but I suggest reality is something more pragmatic, however ugly --> yes, FB likes taking money from anyone, but I'll bet they really, really don't want to be in the business of content filtering, or having to make such decisions, and would rather wash their hands of it.

There's little pragmatic leadership coming from the powers that be over this stuff - obviously there's a good deal of noise out of governments concerning 'fake news' - but I'd argue it's much more difficult than one would imagine given 100 different nations, different laws, lobbying power, even aside from the concerns over really deciding 'what is fake'. I think a lot of TV spots might reasonably fall under the category of 'not exactly factual' as well.

If there were reasonable, clean and coherent direction from US gov (probably not going to happen), it would be easier. A lot of the EU thinking/legislation I suggest mightn't be specifically nuanced enough as well.

I am not cynical about this announcement. I'd wager FB would be somewhere in the range of 'happy' to comply with a set of reasonable bits of legislation, especially if it was applied universally and coherently. In the absence of anyone actually doing anything, they're going to set it up themselves.

Impartial? Hardly, but it's probably better than nothing.

$130M is not chump change, this is a big deal. Perhaps with a change in governance something will happen. (I'm not making a political statement other than to say the Trump regime will never do anything about this.)

Absent any real collective movement on this issue ... I think this is a positive step. It's better than the status quo.

No thanks!

Who is going to run this board? It's probably going to be the last people on earth you want in those positions.

Damned if you do, damned if you don't.

> Damned if you do, damned if you don't.

Actually, it sounds an awful lot like "damned if you don't, damned if you pretend you do to keep not doing it".

And what exactly do you think "it" is? In terms of concrete actions for content moderation, not some vague "be good, don't be bad" moralizing.

Far better* than the alternative: a transparent process to decide what is acceptable content on social media, managed by the government, you know, democratically. At least this way Facebook owns the process from the get-go; it's the purest form of 'regulatory capture'.

Just imagine if technology companies had to adhere to publishing standards like broadcasters and journalists do, because of regulations. All that money from conspiracy theory political ads targeting niche social groups would evaporate. Social media platforms would have far less political influence.

*For Facebook.

In the United States of America we already have something like this, it’s called the 1st Amendment in the Constitution. Since Facebook enjoys the privilege of being classified as a “public square”, they do not have the legal right to censor content at all, unless the content is illegal (threats, etc.). This is not currently being enforced.

Imagine being so oppressed that your ego feels the need to down-vote this comment :(

1. We also have a Federal Communications Commission which regulates to prevent information broadcasters from abusing their power.

2. > Imagine being so oppressed that your ego feels the need to down-vote this comment :(

I don't have the ability to downvote replies. I notice too that your comment has not been edited, so you must have written it at the same time you originally posted, before anyone could have upvoted or downvoted you. Why don't you do a quick edit, for appearances' sake...
