Hacker News

Facebook is removing content because they wanted to make money with a website, and the job of statecraft fell in their laps in the process. They didn't volunteer, they aren't qualified, nobody told them they were signing up to run a kingdom, let alone the world. So they don't produce lofty, philosophical, well-thought-out positions on free speech and human rights. They don't carefully manage the public square with the long term good of humanity in mind. They make the same expedient, best-effort, unenlightened, sounds-good-to-me decisions that literally every poorly qualified there-for-the-wrong-reasons ruler in all of history has made. Your majesty, people are dying because cats are vessels of Satan! By royal decree, kill the cats. That's all this is.

I think they are trying, and I wish they had a Thomas Jefferson on staff and saw themselves as needing one. But they don't.

They chose to do algorithmic timelines.

They could do a chronological timeline. They could choose to not do any editorializing. They chose instead to get into the content promotion business.

Their active choices led to this result.

It's a much bigger problem than merely content promotion and aggregation.

There are countries where Facebook is the entire internet. Maybe they tried to make it happen and maybe they didn't - but however it happened, they are effectively the Ministry of Information now for those countries. They probably shouldn't be. Everyone agrees on this, even them. But like the child-kings of history, it is still the situation. Avoiding the job is simply doing it badly.

Google, Twitter, others, are in a similar boat. They find themselves effectively running branches of government now, not of countries but of the world. Nobody has ever done that before. Nobody knows how to do it correctly, least of all people that mainly want to make money on a website.

The censorship is not because they're evil. It's because one of their subjects approached them saying their stuff is getting people killed and they were like, "Aaugh, I'm just a web site! I don't deal in such matters! Get it off me!" Because they think they're a web site. A political philosopher or an expert statesman would weigh philosophically how to maximize freedom and minimize harm while understanding that freedom gets some people killed, with an eye towards avoiding atrocities. Facebook isn't thinking about the historical risks associated with censorship because they still think they're a web site.

They don't know how to drive this thing - nobody does - and they are going to drive it straight into wars and catastrophes and atrocities. At least, that's the risk. There are countries whose actual ministry of information communicates matters of public safety primarily through Facebook. Or hosted on Amazon. Or whatever. And that company is going to make some minor change intended to protect transgender people in Wisconsin or something, and accidentally massively empower an oppressive government in Africa. That's the problem space. You think you're miserable because people in California are currently effectively setting policy in Arizona? Imagine you live in India. Heavy hangs the head that wears the crown, and this might be one of the biggest crowns ever made. It would be nice if we could find a way to destroy the ring. I definitely don't see another serious answer. In the meantime, the situation is what it is. The rest of us are idiots if we sit around criticizing them for not being perfect. We should be figuring out how to help!

Regarding the Ministry of Information concept, I recently realized that one potential motivation for the large-scale end-to-end encryption agenda is to move the "keys to the kingdom" for the world's trillions of chat messages (and attached media) to end-user devices, *and* loudly signal that this shift has taken place by taking advantage of the very loud minority of security-conscious folks that implicitly trust and signal-boost E2E everywhere they find it. My theory for why this may have become existentially necessary was exactly the sort of "get it off me!" ideology you describe, except from the perspective of protecting from coordinated attacks by multiple cooperating nation states. Of course, all this means general interest in pwning all smartphones everywhere is now that much higher :(. (Medium-size wall of text version at https://news.ycombinator.com/item?id=25522220, also referenced in https://news.ycombinator.com/item?id=27841760.)

Regarding perspective mismatches because Facebook "still thinks it's a web site", thought-experiment nitpick: do Facebook really believe this, or is their problem a) that they are expected to behave as a website by the very same governments they're (optimistically) trying to support, and/or b) that they're afraid to stand up and behave how everyone's pining for them to do because doing so would "rock the boat", so to speak, and trip all over large swathes of plausible deniability that are still comfortably nestled in shadows and grey areas?

I don't think Facebook literally thinks they're a web site -- I think they know something extraordinary is going on. Probably much better than I do. But I do know that what they set out to do, and what they found themselves doing, are very different, and the head shift has been difficult. This is why, when people come to them with questions and requests and complaints that seem a little more appropriate to ask a judge or monarch or a legislature, they make expedient and historically and politically short-sighted decisions. They don't recognize them as questions of political philosophy, opportunities to think about the right way to architect a global community. They don't even draw on the considerable existing wisdom we have about how governments should work. They fumblingly reinvent ideas from ancient Babylon so they can get back to doing what they think they do and what they want to do.

It's hard to blame them. The head shift is difficult. They didn't ask for this responsibility, and, finding themselves at the wheel of something enormous they aren't qualified to pilot, they shrink from the task. Any sane person would. I only write about it here in the hopes that we all can start taking these problems seriously. How should a global information community work?

It's not literally true that Facebook is a branch of government, either. It's just the closest analogy I have, and it's closer to the truth than where we started. Really, they are something new. Something nobody knows how to manage correctly yet. I am not the only one to understand that the stakes are high, which is why people keep passing the buck. Facebook wants the government to tell them what's expected and allowed, and the government refuses -- or acts in small-minded expedient ways, too. They don't know either.

This disinformation and censorship question is one of a hundred with deep philosophical and legal elements. What Facebook thinks about the value of free speech, how they decide disputes should be adjudicated, what they do in small cases of deception and fraud and political manipulation, actually matters a lot more to us practically than what our own laws say. This is a shame, because our legal thinkers actually have a lot more experience with being wise and just when it comes to such matters. Facebook has the power to set policy in a way no one ever has before, and they do not seem to be thinking in terms of implications for humanity. They seem to be thinking in terms of how to get people off their back because this sort of thing isn't what they do. Or perhaps more charitably, because the people coming to them with problems are so terrifyingly powerful and the problems are so terrifyingly high stakes. There have been complaints about minor changes in advertising policy throwing elections in far-flung countries. Would you want that responsibility?

This is why I say we need a Thomas Jefferson. What I really mean is, we need a political philosopher who sees the moment and its dangers and opportunities clearly, and can try to help us do things fairly and right. I don't know where you'd find one, though.

You're right that it isn't as easy as that -- people's expectations play into it. Governments', yes, and people's too. While I say Twitter should be creating a legislature and courts and a constitution, we'd all think they were insanely arrogant to do so. Even though now that I say it, you can probably see that it is getting to the point where that's needed. But they aren't ready for it and we aren't either -- inertia and expectations. A Thomas Paine could fix that. I don't know where you'd find one of those, either.

We are at a time of great change in history. The information revolution is in full swing. The internet was a tool for research, then a toy for nerds, then a novel technology full of opportunity. Now it is changing the way society organizes itself. Some people think these digital communities will supplant nations, or exist alongside them as something equally important. I don't know. I suspect the change is as profound as the one that ended the Middle Ages, when we transitioned from the church to nations as the primary form of social organization. But living in the middle of history, who's to say?

It took us a while to figure out how to run nations, too. We really shouldn't expect too much for a generation or two at least.

My main hope in writing this is to encourage people to take the problem seriously, and think of it as belonging to the space of political philosophy -- which it does. If you work for one of these organizations, learn something about the history and philosophy of government -- why we have laws, why we have courts, why we have elections, why we have bills of rights. You are running a community bigger and more diverse than any nation, with totally novel powers and limitations, totally novel ways to be unfair and evil that you need to avoid, and you do not have the resources to do it right, nor can anyone even tell you what doing it right looks like. It has to be invented. Get help -- from history, from scholars, from anywhere you find it. If you find Thomas Jefferson living under a rock, hire him immediately.

The rest of us are living under the feet of a giant struggling to carry far too many boulders. We yelp when he steps on us -- and we should -- but we should also be trying to figure out if we can build something that will help carry that load. The boulders are there regardless and the guy is hardly up to the task. Running massive communities fairly is hard and complicated. We need wisdom and principled understanding. Attacking the overwhelmed monarch is not how you get peace and justice -- you get that by finding him advisors and engineers that can help him build it.

The situation as it stands is untenable. We need a better way. But until we have one, the situation is what it is and we should help the poor guy stuck managing it.

I think this subthread assumes FB is politically neutral. I don’t think that’s a good assumption.

Political neutrality is a charitable assumption, made because even in this (best) case, the problem is obviously so horrific and difficult that we will need everyone's help and cooperation (especially including Facebook's) to navigate it humanely. You cannot live in a society this large and complex without some give and take, and affection and good faith are the glue that hold us together through those moments of giving and taking. We need it to be true that Facebook intends to do right by everyone, even those they politically disagree with.

Fortunately for us, Facebook believes in its own political neutrality and very much wants us to believe it too. It is a conceit and a fiction -- I think we all know that -- nothing in the world is ever actually politically neutral, and the behavior of these entities is, at times, over the line of how close to perfection you might expect someone making a reasonable and principled effort to get. (An understatement, I know. I still choose to interpret what is happening as a sign of distress, not malice, on Facebook's part.) But we should nonetheless take their conceit at face value, hold them to that standard, expect them to try hard to get close to that ideal, and even assume they were trying and made a human error when they fall short, and help and expect them to do better. It is a tremendous gift that they are trying and think it is worth it to try. Even if that effort is not principled and sincere -- even if they're just trying to make others think they're politically neutral -- it's a massive gift. Run with that. Take it for all it's worth. Believe them and help them succeed and tell them that you're willing to believe they mean well if they act, not even perfectly, but reasonably. The alternative -- a world in which they cannot win and might as well not try, in which we have an openly and unapologetically politically non-neutral entity with this much power -- is far worse.

Facebook may have a lot of power, but it is not so much that going to war with half their countrymen is a good idea -- this is true both for them and for society. Perhaps they would be within their rights to do it, but the effect would be very bad for everyone. The predictable result of that scenario -- the establishment of a competing social media option with a competing flavor of censorship -- eliminates the possibility of healthy, democratic discourse. Our traditional media is in this state right now, which is why both sides are so useless. They cannot talk to each other. They can only talk to themselves. They cannot think high-mindedly about what is best for society. They can only talk about how wrong their enemies are. In an echo chamber and a war, everyone goes crazy, no matter how right they were at the beginning or overall. We need to be able to listen to each other, and choosing to fight instead would just hurt everyone. Loud bias on two sides does not moderate towards reason -- it accelerates towards two flavors of insanity.

We need each other. If we fight, all we will do is break the system, and -- as we see in traditional media -- this isn't worth it. As we navigate society-wide issues, it is easy to see that it would be really nice to have an information infrastructure that worked, but alas, in a short-sighted effort to win an election here or an issue there, we broke it. This is unfortunate because a functioning and rational information infrastructure would be a really helpful tool to have when you need to address a bigger problem like a pandemic. Even looking just domestically, the truth is that neither side needs to win half so badly as they need to be able to trust each other, and need the wisdom that they can come up with together. But of course, this information problem is not primarily domestic. It is revolutionary and it is global. We absolutely cannot afford to fight bitterly over stupid, provincial issues while trying to think about the sort of massive political and philosophical problem that is private companies in California pretending they are not running information infrastructure in Burma. Putting that infrastructure in the center of a US-based political war is a horrific idea to contemplate.

We can't go down that road here. Can't. And even if it looks like Facebook is going down it, responding by fighting is going down it with them. We have no choice but to figure out how to get along. We need high minded wisdom, we need the ability to listen to a huge, global community and make sacrifices for each other. We need excellent vision, and we need good ideals, and we need pragmatic political understanding. Affection and trust and good faith and cooperation are too important to sacrifice. We cannot fly apart, we cannot become enemies, we cannot take offense over small things. The only way to get to where we need to be is to insist on expecting everyone to be doing their best to get along and to accept nothing less. Facebook may be politically biased, but I refuse to accept that. To the degree that it is the case, they can do better, and I believe they want to, and I will help.

The cynical attitude may be true, but the deeper truth is that we cannot afford it.

A chronological timeline is still based on an editorial algorithm. A simple algorithm, but an algorithm nonetheless. It's no less editorial than any other algorithm, like (hypothetically) putting posts in alphabetical order or whatever.
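To put the point in code: "no algorithm" is itself an algorithm, because any feed must pick some ordering. A minimal sketch of the idea (the `Post` fields and the engagement score are hypothetical illustrations, not Facebook's actual data model):

```python
from dataclasses import dataclass

@dataclass
class Post:
    author: str
    text: str
    timestamp: int       # seconds since epoch (hypothetical field)
    engagement: float    # likes/comments score (hypothetical field)

def chronological_feed(posts):
    # The "neutral" feed is still an editorial choice: newest first.
    return sorted(posts, key=lambda p: p.timestamp, reverse=True)

def algorithmic_feed(posts):
    # Engagement-ranked feed: a different editorial choice, same mechanism.
    return sorted(posts, key=lambda p: p.engagement, reverse=True)

posts = [
    Post("a", "old but viral", timestamp=100, engagement=9.5),
    Post("b", "new and quiet", timestamp=200, engagement=1.0),
]

# Same posts, different orderings -- each function is an editorial policy.
print([p.author for p in chronological_feed(posts)])
print([p.author for p in algorithmic_feed(posts)])
```

Both functions encode an editorial judgment; the chronological one is simply the judgment most people happen to consider neutral.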

Excellent post!

Yes, Facebook has found itself wielding power that it could never have imagined. Nobody is prepared or qualified for that kind of power. That's why it took more than 500 years from the Magna Carta until the US Constitution to formulate a reasonable contract to limit such power.

Now you have the most dangerous combination of all: the two ingredients which, when combined, have resulted in horrific atrocities.

The paradox of tyranny: 1) the desire to make the world a better place, and 2) the power and means to actually do so.

But Facebook is a private company & Facebook can do whatever it wants...And Facebook, being a Corporation, has the same rights as a person...

A glib argument? Sure. But this was the same argument used to censor other political speech. What's good for the goose is good for the gander...

The private company defense might become a moot point. There are numerous lawsuits now alleging that these companies are state actors, some with actual receipts showing direct government communication and resulting action.

And just last week we had the White House essentially openly confirming state involvement.

> The private company defense might become a moot point. There are numerous lawsuits now alleging that these companies are state actors, some with actual receipts showing direct government communication and resulting action.

Arguably, any entity that submits articles of incorporation to any jurisdiction is an extension of that jurisdiction.

And in practice, it works out this way. Modern examples: CCP board members at top Chinese tech companies, and NSA PRISM integrations with top US tech companies…

Nice comment. I’m not sure that having a Thomas Jefferson on staff would be enough to make a difference though. The real Jefferson had one goal: build a sustainable democratic country that respects the civil rights of its citizens. Your Facebook Jefferson would still have to build an online system that prioritizes Facebook’s financial health. Respecting the rights of users would be secondary.

> Your Facebook Jefferson would still have to build an online system that prioritizes Facebook’s financial health.

Why do you think that?

Because that’s the way it is currently.

It’s not though. Companies function with significant amounts of waste that are related to internal fiefdoms and have very little relationship to the bottom line.

This might be the best comment I've ever seen on this web site.

All of this is also true for our actual government[0]. I have zero confidence that if our government were in charge of running Facebook/Twitter/any other social media app the results would be better.

[0] https://www.cnbc.com/2021/07/16/white-house-says-facebook-ne...

I agree. I don't think our government is qualified to make these sorts of decisions. I don't know who is. I am not advocating for nationalization of these services.

I am only saying that these people never signed up to have to make decisions about such impactful matters. They are not political philosophers, and they are stuck making decisions of that gravity. I can be mad when they do it wrong, but I can also recognize how tragically outmatched they are. At least governments have judiciaries and cabinets and checks and balances and constitutions and stuff. These guys were just trying to make money on the internet, and suddenly human rights in China became their problem. Nobody seriously expects Twitter to have a full blown judiciary and legislature for processing bans. Nobody expects them to write a constitution which becomes a treasure of a historical document, on how to properly govern the flow of the world's conversation. But at this point, those things would actually be appropriate. It's not surprising they're struggling trying to solve the problem with algorithms. I don't think anyone could succeed at that, and they don't even realize they have the responsibility and opportunity -- they're just trying to do their best to be socially responsible and then get back to making money on the internet.

The best suggestion I have for them is to hold out their hands to humanity and say -- "Look, we have a tremendous opportunity here, and it's bigger than just us. How should we use it?"

Facebook have tried this. They have made an independent governing board that theoretically can tell Facebook what to do w.r.t. content censorship decisions.

No surprise, it appears to be staffed by people who were selected for their middle-of-the-road "you must censor a bit but not too much" type of views. It gave Facebook a limp-wristed rap on the knuckles when they banned Trump, saying the ban was arbitrary and didn't follow the same rules enforced on everyone else, but that the board still agreed with doing it.

I think what we're seeing here is what happens when you lack some sort of free speech libertarian fundamentalism. Facebook don't have to engage in "statecraft", whatever that is, any more than the designers of SMTP did. They could choose not to. They could say "we will shut down accounts when under court order to do so, end of story". Then governments who think a citizen is breaking a law about speech would have to go to court, win, and then the judge would say, here is an order requiring Facebook to shut down the account of this law breaker (which could automatically hide all content they created). All the evolved mechanisms, the checks and balances of the actual state would be in effect.

But Facebook is based in Silicon Valley and, like most firms there, has systematically made deals with far-left devils in order to hire them and put them to work, often without really understanding if it's worth the cost. Does Google actually need 144,000 employees, for example? It hardly seems more productive than when I started there and it had 10,000. Their "hire first, ask questions later" approach inevitably leads to hiring lots of extremists and wingnuts, people who are there primarily to get closer to a nexus of power they can wield for their own political agendas. The constant dramas we see emanating from Mountain View, Palo Alto and San Francisco are the inevitable consequence.

Tech firms could fix this problem very quickly if they wanted to: just announce a renewed commitment to freedom of speech, platform principles and passive moderation. Any employee who doesn't like it can leave. Many would, but those companies are so over-staffed they'd barely notice, and the environment for those who remain would be drastically more pleasant.

The problem with that is if Facebook committed to free speech then users would post a lot of offensive content which drives away mainstream advertisers. We've already seen that happen. Facebook tightened their censorship several years ago specifically because large advertisers were leaving the platform over concerns about their ads appearing next to user generated content that negatively impacted their brands. Obviously Facebook isn't going to do anything that puts advertising revenue at risk.

That's a rather fundamental flaw in their whole business model, isn't it? Advertisers who don't want their ads appearing next to user generated content, on a social network, have missed something rather important.

It's obviously not a flaw. Facebook is highly profitable. The vast majority of user generated content is inoffensive. We're just discussing a small minority of edge cases.

The government, in theory, is bound by the first amendment. FB being run by a government bound by traditional first amendment restrictions would be worlds better than what we have now.

This x100. People would flip if some people had true 1st amendment protection. Can't even give an example because HN would ban me.

If the majority of people here are software engineers, would it surprise people that someone has automated the job of crafting believable bullshit? Not to mention disseminating it faster and better than we have ever been able to?

I see it as a problem that we can iterate “content” faster, identify “audience groups”, run marketing analytics, A/b test “narratives”, all to craft believable, plausible “content” and then mass broadcast it.

We’ve built systems that create content faster and better than a normal human BS filter can block.

How does this have anything to do with the First Amendment? How do free speech rules bring the balance of power back to individual human levels of filtering?

Mind you, the First Amendment is an American construct. It does nothing for things like genocides in Myanmar, or journalist suppression, or hate crimes and the like.

> We’ve built systems that create content faster and better than a normal human BS filter can block.

Maybe so, but I just don't trust that the "BS filter" big tech has constructed will block only "BS" and not true things inconvenient to a certain strident brand of west coast morality. The NY Post story from last year certainly wasn't "BS".

I don't think algorithmic BS is anywhere near as big a risk as you think it is, and I think the risk of outright censorship is far larger than you imagine. Big tech should be a common carrier and viewpoint discrimination in moderation should be illegal.

And my answer to people who believe this is always the same - please volunteer your time to an active subreddit of your choice, preferably one with an active political aspect.

Look, I am not unsympathetic. I have personally gone through the whole cycle - I started from "the antidote to bad speech is more speech, not censorship" to advocating for better tools to handle misinformation.

I would honestly LOVE for the world to work how I thought it did. I worry about the tools we create to clean our "gardens". Yet, without those tools I know most large communities would fail to be governed.

I am largely tired, of these debates on HN where old arguments are rehashed, un-tempered with empirics. Heck, people should be upset that the data that can answer these questions is under NDAs.

> please volunteer your time to an active subreddit of your choice, preferably one with an active political aspect.

Why reddit? That site is a soup of immaturity, did you expect otherwise? On reddit you have people with 5 accounts creating conversations to control narratives and make it look like you have a larger group than you do supporting your case. This is more difficult to do on FB because they do a bit of account validation and other measures.

And your example of a political forum on reddit is a great example of why we don’t want censorship. On reddit, admins ban people simply for not speaking to the narrative, or for saying something that is not misinformation but that the admins don’t like. This is literally the same problem we have with FB now. The moral of the story is: stop censoring. It’s ok if someone doesn’t say what you think is the truth; I promise the world will keep spinning.

How does that work? reddit speech is not worthy of being heard? It’s too immature to even be worth experimenting with moderation?

If people are creating 5 accounts to create a narrative, well then that’s the job, isn’t it? Stopping those 4 fake accounts.

But that would be censorship. So what is your solution for the problem you yourself have described?

Plus, if your first option is to choose a forum where you self-select out of dealing with messy problems, then your experience is invalid for guidance on dealing with messy problems - no?

I can’t seem to see anything but a contradiction of your own purposes here.

Perhaps you see how these things are not contradictory?

> How does that work? reddit speech is not worthy of being heard? It’s too immature to even be worth experimenting with moderation?

I didn't say it wasn't worthy of being heard, I said it's immature and there's zero cost to account creation and therefore you have literal children going to that site and spitting nonsense. Moderating that site is like herding cats.

> If people are creating 5 accounts to create a narrative, well then that’s the job, isn’t it? Stopping those 4 fake accounts. But that would be censorship.

Yup, banning them would. Did I say ban them? It was used as a reference to immaturity, because immature / crazy people create accounts to make a cohort that does not exist.

> So what is your solution for the problem you yourself have described?

In short, change the way these people are raised. Change what they're being taught in schools that enables this behavior. It's unacceptable to throw online temper tantrums when you don't get your way. It's also unacceptable to attempt to force those into your beliefs. So start by teaching children that instead of raising the entitled society that we have today.

For those, seemingly like yourself, who think the world will end if a group of people start believing the world is flat, I'd say learn to ignore people.

> Plus, if your first option is to choose a forum where you self-select out of dealing with messy problems, then your experience is invalid for guidance on dealing with messy problems - no? I can’t seem to see anything but a contradiction of your own purposes here. Perhaps you see how these things are not contradictory?

How about let me speak for myself and quit carrying the conversation forward using assumptions from yourself?

> We’ve built systems that create content faster and better than a normal human BS filter can block.

How does this work? Specifically, the "faster than a normal human's BS filter" part.

Without access to specialized knowledge, like being an expert in a subject, users will either check the authority/standing of the speaker, the emotional appeal of an argument, or the underlying logic of it.

In my simple way of putting it - creating a website, or creating a post, and then having it disseminated is dirt cheap today. You can get an article on a news website, have it referenced by a YouTube channel, and have that sent to a Twitter feed.

That alone is sufficient to discuss an increase in the volume of content being created - however, that volume also gets disseminated as fast as it is created, which is what accounts for the speed.

This is also without looking into the fact that people tend to use superficial traits to assess whether information is credible online.

"Yet, research shows that people rarely engage in effortful information evaluation tasks, opting instead to base decisions on factors like web site design and navigability. Fogg et al. (2003), ... They argue that because web users do not often spend a long time at any given site, they likely develop quick strategies for assessing credibility."

From: "Credibility and trust of information in online environments: The use of cognitive heuristics" (Miriam J. Metzger and Andrew J. Flanagin)

So a good looking website, with content that purports to be endorsed by known authorities, and hits the right cultural blind spots for its audience will get past their filters.

If you’re gathering information faster than your BS filter can process it, then you’ve also exceeded your capacity to process the information you’re receiving. Assuming "BS filter" means some extension of comprehension.

Yes, I think so too. However, people will still consume content as long as it gives those dopamine hits. The brain makes you think you are doing something, even if it’s not really comprehending what it’s consuming.

Which is getting closer to the problem as I see it - the tech infrastructure has outscaled the default biological tools we are born with.

If you’re consuming information but not comprehending it, then this is listening vs hearing. This is my original problem with this “faster than BS filter”. If you claim that information is coming too fast for them to comprehend, then they are not comprehending it and therefore not getting the information. Which would make this all false.

