Hacker News
Social-media platforms are destroying evidence of war crimes (economist.com)
253 points by CPAhem on Sept 26, 2020 | 87 comments



I was trying to reply to a post that got flagged, so I'll repost at the top level instead:

I strongly disagree with [the other poster's lack of concern about war crimes], but I want to try to paraphrase the part of your intuition that makes sense to me.

Facebook gets criticized by its users and governments both for taking down "too much" and "too little", sometimes with regard to the same post or subject area.

Suppose user X posts something related to an armed conflict, violence, suffering, or death. This post might be part of a crime by X against Y, or evidence of a crime by Y against Z, or not really a crime at all but just really disturbing and upsetting. Also, in various circumstances people might want records about violent crimes against them to be destroyed, or publicized, or not destroyed but not publicized (only used by some judicial process, or truth-and-reconciliation process, or historians, or something). Also, user X might have an intention that's different from the primary value or valence of the content, like prurient enthusiasm for violence, or making one of the parties depicted look bad, or trying to intimidate one of the parties depicted.

In order to figure out which category (or categories) a post falls into, Facebook has to (1) learn the language(s) involved in all posts, (2) learn the political context of violent conflicts, (3) perform some level of adjudication and fact-finding about political conflicts and disputes, and maybe even (4) try to understand the motives of the person who posted something on each particular occasion.

Is it reasonable to expect Facebook to do all those things, compared to other choices that are consistent, neutral, and inevitably result in various type I and type II errors with respect to the nature and purposes of posts related to violence?


> Is it reasonable to expect Facebook to do all those things, compared to other choices that are consistent, neutral, and inevitably result in various type I and type II errors with respect to the nature and purposes of posts related to violence?

I’ll grant that we’re in uncharted territory as far as the scale and variety of content a company like Facebook has to deal with on a regular basis. That said, Facebook constantly tells us how they want to change the world for the better.

If that’s not just marketing bullshit and they want us to take them at their word, then they need to recognize that the position they’ve put themselves in demands a level of accountability and responsibility that they’re clearly not comfortable with. Ultimately, citizens should be able to define what’s reasonable to expect out of Facebook, not the difficulty of a particular content problem. Becoming one of the largest tech companies in existence doesn’t grant one immunity from real problems involving their content.

If Facebook can’t manage their content at scale to address these kinds of issues, then perhaps they shouldn’t be managing content at this kind of scale.


The problem is much bigger than that. Can you tell the difference between terrorists and war criminals committing an atrocity, and a militia of freedom fighters delivering their brutal but understandable form of justice?

Can anyone?


Easy - ask the US government which ones are the bad guys. If you get too far away from where they draw the line between terrorists and freedom fighters, your platform won't be welcome anywhere in the western world anyway.


I think you're joking... but sarcasm is hard to spot in text format

https://en.wikipedia.org/wiki/United_States_involvement_in_r...


Under my proposal, all justified military actions aligned with those in your link would be considered good and necessary, and censored only if necessary for military secrecy or to avoid alarming people. All atrocities committed by those aligned to other sides should be considered acts of terrorism, and of course it should be illegal to cover up evidence of such war crimes. Easy!


Terry Pratchett would be proud. He even formalised the universal battle cry: "remember the atrocities done to us to justify the atrocity we are about to commit!"

As would another, earlier British author. Can you remember when you were not at war with Eastasia?


Yes. Sometimes it's not hard:

> In 2018 the BBC looked into a video circulating on social media showing soldiers blindfolding and then shooting two women and children in Cameroon.


Sure, sometimes you do get an obviously easy pick.

But what I'm talking about are the amateur snuff clips where a small group of men armed to the teeth and dressed in camo stand in front of bound, kneeling young men, who are then executed, one by one. And it's only later you learn that the young men were serial war criminals and rapists who not only committed the acts themselves but forced their victims to commit them on their own families, before feeding them a hand-grenade brunch.

For us it's disgusting and horrifying. For the societies where it happened that's evidence of justice having been carried out.

And the worst of it? If material like that were filmed on a Hollywood studio lot, you'd give it a PG, or "15" at best. After all, what's not to like in some good old-fashioned vigilantism?


No one can. It all depends on which side you support - the only option is to present both viewpoints in equal measure to a third-party audience.


The problem is that different sets of people have different beliefs about what constitutes accountability and responsibility. For example, the article criticizes algorithmic moderation of terrorist content as an abdication of responsibility, but many groups argue that manual moderation is an abdication of responsibility - the EU is considering a law that would require Facebook to algorithmically moderate terrorist content.

That doesn't mean, of course, that nobody can have an opinion on what the right thing for Facebook to do is. But you can't abstract over the substance of the criticism and say Facebook isn't "addressing these kinds of issues" without specifying whose standards they ought to follow.


I don't understand this idea that "citizens" can hold the reins of censorship. If an item's removal is up for public debate, it was never really removed. E.g. it does not really solve the problem of Holocaust denial content if you still have people complaining that Holocaust denial content should be allowed. And it is not a democratic law anymore if you don't get to criticize it.

Censors operated in secret and beyond accountability not because of some accident of history, but because it was essential to the nature of their work.


Censorship isn't all-or-nothing though - spoiler tags are a common example. A spoiler tag stops you from reading the content and says "warning: you don't want to read this because X", then leaves it up to you. You could censor misinformation etc. with this method, yet still be held accountable.


Disclaimers can be helpful but you’re still giving those views a platform, and horrible people who want to congregate and reinforce their views to each other can still do it. I don’t think this satisfies the anti-big-tech crowd, although I’d be happy if it did.


To put it another way: if Facebook were only a communication platform, just like telephone or email, no one would have grounds to complain about its inability to ban crime or hate speech or whatever unpleasant thing people want banned.

However, Facebook chose to use an algorithmic feed to push new information to users, which makes it a media company in this sense. As a traditional media company, Facebook of course has the responsibility to ensure the content is suitable for the public, even if Facebook is not the producer of the content. Just give users a chronological feed if they don't want to invest the effort. Facebook can't maintain its business model while trying to dodge responsibility.

After all, the problem is that most people accept this kind of business model and most people don't care. Once social media companies came to power, they had enough capital for lobbying to persuade governments not to outlaw their business model.


> Is it reasonable to expect Facebook to do all those things, compared to other choices that are consistent, neutral, and inevitably result in various type I and type II errors with respect to the nature and purposes of posts related to violence?

That is their responsibility. If they have an unreasonable amount of responsibility, then it follows that they have an unreasonable amount of power. The solution is to take away that power, not to live with it being wielded uncontrollably.


Facebook wants to be regulated, partly because they're better equipped to deal with regulation than upstart competitors and partly because it means that these dilemmas become the responsibility of government.

https://www.washingtonpost.com/opinions/mark-zuckerberg-the-...


Absolutely nobody is holding Zuckerberg accountable for anything he says in an article like this, so it's effectively meaningless, but we can still analyze it a little for fun.

Note that he does not call for any regulations that would actually be new to Facebook. He only gives examples of things that Facebook already does, such as releasing transparency reports and adhering to GDPR. So I would read this article as "Facebook does not need to change in any way, but everyone else should be like us".

Does Facebook back up any of this article with actions? Do they support politicians who would enact these policies, or do they evenly support all points of view so they have influence with everyone? Does Zuckerberg hold private dinners with privacy advocates and supportive politicians to discuss policy and strategy? Is Facebook a prominent sponsor of any organizations that are working to make these ideas reality?


It's not uncontrollable. It's just controllable by a heterogeneous set of people with different goals, ideals, and policy prescriptions. Splitting up Facebook's power among a dozen smaller networks wouldn't change that fundamental conflict.


Facebook has a corporate hierarchy, and there are a small number of people at the top who have real meaningful power, even if they can't micromanage every decision. Splitting up their power wouldn't magically fix any problems, but it would lessen the potential damage one person or small group of people could do.

It would also limit the damage an outside actor could do by abusing a platform, the same way America's "distributed" presidential election is safer than having one centralized election or one system that every state uses.


I don't see how scale matters to the moral question. Whether they are a giant multinational company, or a little personal blog, the problem is still there. They currently do not have employees who examine this stuff or decide if it needs to be reported to the authorities. You think the law should require it? If so why then shouldn't I also be required to read what's in my email spam folder to check for crimes too? Sure it's not very likely there's much there, but we aren't talking about a new law just applied to me. It would apply to all other individuals like me too, and 300 million people being required to check our spam folders and personal blogs may well uncover far more crimes than on facebook.


Sure, there are lots of reasons it might be better to decentralize. I'm just saying that achieving specific content moderation standards isn't one of them, because there's no reason to expect new, smaller companies to favor Human Rights Watch's preferred policies.


Yeah ok, we agree. I don't think smaller companies would moderate better, just that it would be less of a concern when they do it badly. I mean there are small social media companies right now that don't moderate at all, but it's not a pressing issue because they don't have much impact.


> Is it reasonable to expect Facebook to do all those things, compared to other choices that are consistent, neutral

I think there are two ways to think about Facebook, in the context of this article. One is as a journalistic organisation. If we go that way, then yes, I think it is reasonable to expect Facebook to make massive efforts to understand context and make informed decisions.

These decisions are impactful. If you are the primary journalistic news source for something this meaningful, then yes, you have a responsibility to do this job well.

The other way of seeing Facebook is as a neutral platform. In that case, they need to have genuinely neutral and transparent rules. Ad hoc, subjective, closed door decision making is not neutral.

The problem is that Facebook is having it both ways. So, they're getting criticism (as you point out) from both directions. That's not unfair to Facebook, it's a choice they make.


Facebook creates these features and refuses to own their consequences. They gave everyone a megaphone and a private tv channel. Now they want to run away from owning responsibility for the content which they use to bring the users who will see the ads which line their pockets. Unacceptable.

Maybe it’s time to cancel features if you don’t want to dedicate some portion of your many billions of profits to solve the problems which said features create for society.

Oh, and to everyone who will say it's hard: if you can't solve the problems a feature creates, then you shouldn't be rolling it out. It's just that simple.


So the phone company provided a way for criminals to organize in secret and communicate with each other without having to meet up in person, so the phone company is responsible for that crime?

Unless I misunderstood your argument. It seemed to be that you should be responsible for everything bad someone else does with something you made. You made a cooking knife. Someone stabs someone with it. It's your responsibility. You shouldn't have introduced cooking knives into the world knowing that people might stab others with them.


My take on the debate over internet censorship is that the "free speech" contingent badly wants to deal with it using elegant, consistent, universal principles.

While a solution that meets those criteria should always be the goal, we must recognize when it isn't possible. If we don't, we wind up with something that satisfies our ideology without actually being a good solution.

Such issues often result in new law. If any phenomenon creates a certain level of harm, a society is liable to alter its laws to deal with it, breaking with convention if necessary.

Regarding your phone company example, what that means is that the answer to your question might depend on the extent to which the criminal activity is harming society.

Regarding Facebook, I'm open to betraying my normal principles, because the amount of injury Facebook does to the world justifies exceptional remedies.


Not really - it would be akin to a phone company allowing its users to create a multi-user international terrorist hotline and refusing any kind of responsibility for it... while also sending out promotional material to people it deemed 'potential terrorists'.


I think you are not taking into account that the knife and the phone are not "mass/broadcast" tools. They are one-to-one, or one-to-few at most.

Moreover, Facebook is not a neutral conduit; the algorithmic feed acts like the editor of a newspaper (and an editor that can give each person their own personal newspaper at that!)


That is a simplified argument. Facebook is not a phone company. A phone company has an amplification effect on communication, but Facebook, with a billion customers reachable through its choice of algorithms, is in a totally different class. A phone company doesn't push content on the user. Facebook does.


> You made a cooking knife. Someone stabs someone with it. It's your responsibility. You shouldn't have introduced cooking knives into the world knowing that people might stab others with them.

This is the logic used to put drug dealers and manufacturers in prison when people overdose on their drugs and die.


> So the phone company provided a way for criminals to organize in secret and communicate with each other without having to meetup in person so the phone company is responsible for that crime?

This is a bad analogy. The phone company isn't a broadcaster. Facebook is.


I don't see that. The reason that Facebook was even possible is safe harbor laws that shield them and every one of their vendors from liability.


> Is it reasonable to expect Facebook to do all those things, compared to other choices that are consistent, neutral, and inevitably result in various type I and type II errors with respect to the nature and purposes of posts related to violence?

I would say it depends on how Facebook presents itself to the wider world.

Is it just a content aggregator that allows others to share their thoughts? Then no, we shouldn't expect them to do it.

Are they a platform where they curate the content? Then absolutely yes - they chose this; you can't eat the cake and still have it.


While whoever is blocking content indeed finds themselves between a rock and a hard place and will never be able to please everybody, I don't think there is only a choice between "leave it up" and "block it".

I think one solution would be to archive certain violent imagery in nonpublic storage and give historians, journalists, and scientists access on a project basis. This isn't without challenges and cost, but as humanity we have to do certain things to make sure evidence of violations of our human rights doesn't get flagged for deletion. I just wish our great minds would actually care about problems like these beyond chasing the next market-based disruptive technology with totally unclear societal consequences.


> Is it reasonable to expect Facebook to do all those things, compared to other choices that are consistent, neutral, and inevitably result in various type I and type II errors with respect to the nature and purposes of posts related to violence?

It's reasonable to hold them accountable for the extent they're facilitating the problems. Even (especially) if it's an unavoidable by-product of FB's existence.


There's no inherent conflict between taking things down and preserving evidence - it should not be difficult for Facebook to "cease publishing" a particular piece of content (i.e. stop showing it to the public) while preserving it in their systems as evidence, together with all the metadata about who posted or reposted it, when, and how.

In fact, for the main categories that must be taken down quickly (e.g. child abuse material, death threats, etc.) it seems obvious that the evidence must be preserved - why not extend that to all the violent stuff that gets banned?
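
A rough sketch of what that split between visibility and retention could look like on the data-model side (all names here are hypothetical - this is not how any platform actually implements it):

    from dataclasses import dataclass, field
    from datetime import datetime, timezone
    from typing import Optional

    @dataclass
    class Post:
        post_id: str
        author_id: str
        media_blob: bytes                      # the content itself; moderation never touches this
        uploaded_at: datetime
        published: bool = True                 # controls public visibility only
        takedown_reason: Optional[str] = None
        audit_log: list = field(default_factory=list)

    def unpublish(post: Post, reason: str, moderator_id: str) -> None:
        """Remove from public view without destroying the evidence."""
        post.published = False
        post.takedown_reason = reason
        post.audit_log.append({
            "action": "unpublish",
            "moderator": moderator_id,
            "reason": reason,
            "at": datetime.now(timezone.utc).isoformat(),
        })

Public serving paths would filter on published, while a separate, access-controlled path (court orders, accredited researchers) could still read the blob and the audit trail.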


> Is it reasonable to expect Facebook to do all those things

Yes. It is then up to Facebook to decide if they want to play the game where they are expected and required to do that, or shrink themselves into an operation insignificant enough that it is no longer reasonable to expect it. Either you are the most significant social media company in the world and face the consequences, or you are not. The world is free to decide these kinds of things...


It may not be the first instinct in a forum of programmers, but there really is a question hanging over whether something like Facebook can/should/will continue to be an international company.

At some point it could be discovered that the problem they are trying to solve is politically unsolvable. Regulatory structure starts to build up and it becomes impossible to run multinational social media companies - as a thought experiment, anyway.


Or we get a world government.


> Is it reasonable to expect Facebook to do all those things, compared to other choices that are consistent, neutral, and inevitably result in various type I and type II errors with respect to the nature and purposes of posts related to violence?

No, and that is precisely why their moderation should be transparent.


“Publicity is justly commended as a remedy for social and industrial diseases. Sunlight is said to be the best of disinfectants; electric light the most efficient policeman.” -- Louis Brandeis

Count pictures on social media as a form of that light. Of course I don't want to be exposed to snuff films. But to the extent that blocking them sweeps evil under the rug, that's a selfish impulse. And it's not just the existence of such evidence that matters - as if blocking were fine so long as it's preserved for a court. It is our very disgust and outrage that drives change. To shield the public from it is to abet such crimes. It's a good thing to protect children from it and a bad thing to treat us all as children.


Might exposure to snuff films be a morally appropriate consequence for financing the production of such films via tax payments?


Tax collection is coercive and therefore tax payers cannot be held responsible for the actions that are performed using that revenue. Voters maybe, but not racketeering victims.


Removed from public access != deleted

For all we know, the content still exists on the social media platforms' servers, and access is just blocked to the public, but the platforms will happily provide it to the proper authorities in response to an appropriate request.

(Of course, I can't rule out the possibility they may have physically deleted some of it, or may do so eventually – I don't know what their data retention policies are – but I imagine they would be hesitant to permanently delete content that may plausibly be of future interest to regulators – this article contains zero information on whether they have actually physically deleted any of it or not)

Human Rights Watch is of course a purely private body, so unless they reach some special arrangement with the platforms (and platforms are sometimes willing to enter into those sorts of special arrangements), they are not going to have any more access to removed content than the general public has. But that doesn't mean that government authorities don't have more access than the general public has.


Imagine trying to justify removal of public access to evidence of war crimes. That's a wild idea of what is ok.


The article describes exactly the challenge with this black-and-white framing: many things are both evidence of war crimes that ought to be available and terrorist propaganda that social media platforms shouldn't spread.


> Removed from public access != deleted

Is concealment meaningfully better than destruction here?

I don't think so.


Of course it is! Maybe I'm not understanding your perspective here, because I can't really imagine an argument for this position. If someone posted that first video on HN, the one where a guy gets shot in cold blood, surely you don't think it would be equivalent to destruction of evidence for the mods to take it down.


Without public knowledge and access to the media directly, what incentive or pressure does government have to pursue consequences for these crimes?


What I find ironic is that this is the exact logic Facebook used for not taking down posts from politicians (the political speech exemption), and then most people didn’t find that very compelling.

Perhaps the content should just be hidden and hard to access to reduce virality, but still allow people to view it if they want? That wouldn’t satisfy everyone either, though.


> Without public knowledge and access to the media directly, what incentive or pressure does government have to pursue consequences for these crimes?

I agree with you, but to play devil's advocate, without public knowledge and access to the media directly, what incentive or pressure do terrorist organizations have? If the whole point of these acts is to instill fear, if they are not well-known the impact is limited. I can see that argument but don't have a good rebuttal.


You don’t combat terrorism by not talking about terrorism (aka censorship).


Why not? For most things I'd agree with you, but in the case of terrorism specifically (whatever exactly that word means) I'm not convinced. Isn't the goal of terrorism to inflict terror or raise awareness? If no one is talking about your act of terror you've failed at your goal.


Because terrorism is used by small groups to inflict tyranny over larger groups, and if those larger groups self-inflict tyranny in the form of censorship, they reach the same end as terror attacks themselves.

Just like poker, terrorism isn't a thing that exists in a vacuum: it exists inside of, and in relation to, wider society.


I mean, the fact that they happened.

The existence of the war crime allegations isn't being removed.

People horrified by the war crimes themselves provide the incentive/pressure. Photos/videos don't add to it, they just confirm the evidence.

I really don't think any victims' families are saying "oh well, because we can't see the videos anymore I guess we won't bother the government after all".


The fact that crimes happened is not pressure upon governments to act upon them, no.

Governments generally only respond to public pressure or existential threats to their power and control and revenue, and little else.


> The existence of the war crime allegations isn't being removed.

Oh, but it is. The direct witnesses' accounts are minimized and ridiculed for lack of evidence, then the witnesses stop talking about it (normal ones; the crazy ones keep ranting about it forever) and the matter is totally gone within a generation. A historian might dig it up a few decades later to make a career and by that time it's too old to be relevant.

Consider these major events before they were part of accepted reality: NSA spying, coronavirus, the Holocaust, Uighur ethnic cleansing, Tuskegee syphilis experiments, etc.


Do you really think the Iranian government is going to subpoena twitter to acquire evidence that Iran committed war crimes and then prosecute themselves?

Citizen outrage is the only lever we have to prevent governments from committing war crimes. If you support blocking access to the public, you're essentially saying these governments should be allowed to commit war crimes.


> Do you really think the Iranian government is going to subpoena twitter to acquire evidence that Iran committed war crimes and then prosecute themselves?

Another country's government could. War crimes can be prosecuted by any country (under universal jurisdiction), by the country on whose territory the crimes were committed (territorial jurisdiction), and also possibly if their citizens are among the victims (passive personality principle).

Even if in practice prosecution isn't feasible (such as due to inability to apprehend/extradite the defendants, a legal system which disallows trials in absentia, limited availability of evidence, etc), at least a foreign government could hold a formal investigation and publicise the results. Subpoenas aren't restricted to criminal cases, they can also be used for formal public inquiries (such as the concept of Royal Commissions found in Commonwealth countries, or the 9/11 Commission and Warren Commission in the US), for inquiries by the legislature (inquiries by committees of Congress/Parliament/etc), for intelligence collection, etc. It is also possible for a national government to subpoena data from private companies and then provide that data to an inquiry by the UN (or one of its agencies).

There is also the International Criminal Court (ICC). Iran is not a member state of the ICC, but the ICC could exercise jurisdiction over Iranian citizens in one of three ways: (1) if Iran were in the future to join the Court, or consent to its jurisdiction in a specific case; (2) if the Iranian citizens are accused of committing war crimes on the territory of an ICC member state; (3) if the UN Security Council made a referral. Now, in practice, Iran probably isn't going to join (or consent to a specific case), and China and/or Russia would probably use their Security Council veto to protect Iran, but method (2) might still work.

If some country's government, or the ICC, wants to subpoena data about Iranian war crimes from a social media platform, the social media platform is likely to comply (although there are all kinds of complex political and legal factors involved.)


I hoped the article would go into more depth. For example, propagandists store caches of videos and imagery on distributed servers to keep circulating it, and manually or automatically run imagery through filters that distort the image enough to fool computers but are still easily recognizable to humans. In some cases this is done to such a degree that it becomes a kind of in-group aesthetic, a style distinctive enough to be its own signature and obviate the need for attribution.
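
For anyone wondering how a filter can "fool computers but not humans": platforms match re-uploads with perceptual hashes rather than exact byte comparison, and those hashes are only robust up to a point. A toy illustration with a simple average hash (assuming Pillow is installed; production systems use sturdier schemes like PhotoDNA or Facebook's PDQ):

    from PIL import Image

    def average_hash(path, size=8):
        """Shrink to 8x8 grayscale; one bit per pixel: brighter than the mean?"""
        img = Image.open(path).convert("L").resize((size, size))
        pixels = list(img.getdata())
        mean = sum(pixels) / len(pixels)
        return "".join("1" if p > mean else "0" for p in pixels)

    def hamming(a, b):
        return sum(x != y for x, y in zip(a, b))

    # Hypothetical files: a mirror, crop, overlay, or color shift that a human
    # barely notices can push the Hamming distance past the match threshold.
    # d = hamming(average_hash("original.jpg"), average_hash("filtered.jpg"))

Each distortion the matcher learns to normalize away just prompts a new one, which is part of how you end up with that distinctive in-group aesthetic.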


This is fascinating and I would like to read more about it.


Unfortunately most of the high-quality writing about it tends to focus on state actors, or else be hand-wringing: 'extremists are terrible, why won't social media companies do something'.

This is a decent big-picture article, but like anything to do with memetics it's kind of out of date already. https://johnkoetsier.com/fake-news-the-truth-about-lies-with...



Censorship has innate negative nth-order consequences. To fight temporary discomfort we destroy history.


Yes, this should be upvoted.

"we destroy history" reminded me of these guys:

https://www.google.co.uk/amp/s/www.bbc.com/news/amp/world-mi...

> As IS notes in the eighth issue of its own publication, the glossy Dabiq magazine, they see ancient cultural heritage as a challenge for the loyalties and legitimacy of Iraqi or Syrian people to IS itself.

> Destroying such heritage is thus a part of their duty, as they see it, to reject such a "nationalist agenda" that the statues, temples, and indeed, cities represent.

> To some degree, describing such desecrations as a "war crime", as the UN has, nicely plays into IS' hands - as do articles on the subject.

> But the internet cannot be un-invented, and unless we are to surrender some of our closest held beliefs on freedom of speech, we cannot stop dissemination of such depressing stories.

> We must, therefore, respond however we can.

> Calm reasoning exposing the hypocrisies, the practicalities, and the banalities of IS' policies is a step towards demystifying and debunking the likes of IS as just yet another political organisation.

---

I guess we surrendered our "closest held beliefs on freedom of speech" in just 5 years.


This is the truth.


This seems like a good thing. The article does explain how you probably shouldn't use what you see on Facebook as evidence in court.


"If you don't read newspaper you are uninformed. If you do read a newspaper, you are misinformed" --Mark Twain (b. 1835) goo.gl/KXNf9


This has been a systemic issue reported on for years, e.g. by the Intercept in 2017 [1] and the Atlantic in 2019 [2]. The story doesn't really make that clear, considering the Economist headline is almost identical to the Atlantic one.

[1] https://theintercept.com/2017/11/02/war-crimes-youtube-faceb...

[2] https://www.theatlantic.com/ideas/archive/2019/05/facebook-a...


In a lot of countries, it's illegal to tamper with evidence of a crime. Would this destruction of evidence count? https://en.m.wikipedia.org/wiki/Tampering_with_evidence


Yet another one of the myriad negative externalities that have come from the rise of cloud everything and people forgetting how to download their media.

Own your bits, people.


A tape drive is, as far as I can tell, the only reasonable backup medium. Even then, where file format is concerned, our industry doesn't care about promoting a 1000-year format for file storage.

If you back up online, you delegate privacy and longevity to a stranger. If you back up to DVD, watch out for bit rot. If you back up to HDD, watch out for mechanical failures. If you back up to SSD, well, failure typically means the whole thing is dead.

Even if you have a medium that can keep data alive for more than a couple decades, chances are, when the time comes, you'll be stuck doing a lot of research to access old disk and file formats.

Tape drives really ought to be a staple item in the consumer market, not just enterprise, and companies like Google, Microsoft, and Apple ought to guide customers to a common standard to ensure their archives will last a lifetime.


RAID-1, block-checksumming filesystem, regularly-scheduled scrubs.

Everything rots. Entropy is unavoidable. Math outlives physics.
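
For data on a filesystem that doesn't checksum for you, even a crude manifest-based scrub catches rot while a good copy still exists (a hypothetical sketch; ZFS/Btrfs do this properly at the block level):

    import hashlib, json, os, sys

    def sha256(path, bufsize=1 << 20):
        h = hashlib.sha256()
        with open(path, "rb") as f:
            while chunk := f.read(bufsize):
                h.update(chunk)
        return h.hexdigest()

    def scrub(root, manifest="checksums.json"):
        """Hash every file under root; complain if a known hash changed."""
        try:
            with open(manifest) as f:
                known = json.load(f)
        except FileNotFoundError:
            known = {}
        for dirpath, _, files in os.walk(root):
            for name in files:
                path = os.path.join(dirpath, name)
                if os.path.abspath(path) == os.path.abspath(manifest):
                    continue  # don't scrub the manifest itself
                digest = sha256(path)
                if known.get(path, digest) != digest:
                    print("POSSIBLE ROT:", path, file=sys.stderr)
                known[path] = digest
        with open(manifest, "w") as f:
            json.dump(known, f, indent=2)

A mismatch caught on the next scheduled run is recoverable from the mirror; one discovered years later usually isn't.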


But we're talking about a much shorter time horizon: evidence disappearing because nobody thought to run youtube-dl or somesuch.
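
And running it is a one-liner, or scriptable if you want the metadata preserved alongside the video (the URL below is a placeholder; yt-dlp works as a drop-in replacement):

    import youtube_dl  # pip install youtube-dl

    opts = {
        "outtmpl": "archive/%(id)s.%(ext)s",  # keep the platform's own video id
        "writeinfojson": True,                # save title/uploader/date as JSON
    }
    with youtube_dl.YoutubeDL(opts) as ydl:
        ydl.download(["https://www.youtube.com/watch?v=EXAMPLE"])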


The problem is modern tape drives cost $5000


That's my secret agenda ;) I want to see affordable tape drives! If all the people who need a tape drive but don't know it - and won't find out till it's too late - knew they needed one, there'd be more competition in the market.


They only exist because of the cloud.

Normally they would never see the light of day.

We just need to adapt. Things social media delete for a 'reason' probably matter.


Good. I mean we need more such articles, so that just as the BBC shamed the Cameroonian government into action, we can shame tech companies into action too.


It wouldn't be hard for the feeds of these items to be fed to the appropriate governmental and non-governmental agencies, gated by some reasonable process.


Facebook has killed Rohingya. Mark Zuckerberg has blood on his hands.

The U.S. has the ear of Facebook. Facebook listens to the E.U. but threatens to defy them. It largely ignores international human rights organizations, though.

> In June, The Gambia filed an application in U.S. federal court seeking information from Facebook that would help it hold Myanmar accountable at the International Court of Justice (ICJ).

> Earlier this month, the company filed its opposition to The Gambia’s application. Facebook said the request is “extraordinarily broad,” as well as “unduly intrusive or burdensome.”

https://time.com/5880118/myanmar-rohingya-genocide-facebook-...

Edit: "Facebook staffer sends 'blood on my hands' memo" https://www.bbc.com/news/technology-54161344


Does anyone actually give a fuck about this? Everyone wanted Facebook to be where people go to see baby photos. Now it is that thing. It's fine. I see lots of baby photos and my life is good.

So go upload to Liveleak if you want to post pictures of people shooting hooded people. Bro I don't want to see that and previously I was okay at avoiding it on Facebook. Then everyone decided that it was important that Facebook remove anything political that could be spun. Now they're doing that. Great. Now you want to complain about that?

Listen, I'm going to be honest. No one gives a fuck about war crimes beyond a "Oh no, this is horrible" thing. If you asked people to contribute a dollar to fighting them you'd get zero dollars.


There has to be a limit to cynicism, at least on a discussion forum like this

Twitter partially addresses this issue by letting you flag an account as NSFW. If you're going to post war crimes or nudes, great, I don't have to see your content even if it's retweeted - I can shut off NSFW content


Honestly, there's probably money in making people feel better about outrageous things in some way. I think Reddit does an okay job here since they let people bicker on subreddits and 'award' people on either side. They sell people the illusion of impact, and that's probably worthwhile.

So I think I've changed my mind completely. People don't not care about war crimes, etc. They care about them through the identity of being someone who cares about war crimes. In that respect, it is amazing what Facebook is doing since they are throwing away a pretty concrete engagement opportunity. Fascinating.


I want to see nudes but not gore, so the NSFW filter doesn't work.

I want to see posts about accomplishments, but not Instagrammable food. I want your honest accomplishments, not your drama. I want to see pictures of puppies, but not babies.

In other words, what I want is different from what other posters want. This is fine, until you throw us all together into the same area. Then we end up having all these fights.


Whoa.


Ugh. Activist organizations trying to make social media do their work for them. While keeping the donations for themselves.

Facebook just can't win. They should just start ignoring such demands.

Want evidence of war crimes preserved - then do it yourself.



