UN blames Facebook for spreading hatred of Rohingya (theguardian.com)
39 points by IBM 6 months ago | 69 comments



I'll say this again:

We, as a society, across multiple governments and now the UN, with ample public support, are dragging Google and Facebook into the role of judge and jury of what is or isn't acceptable speech.

They may well eventually settle into that role, and come to like it.

I understand that the volume of information grew much faster than the traditional legal avenues for taking down illegal speech. And we're scrambling for a quick fix. But this fix may prove worse in the long run than the problem it fixes. These organizations may be relatively enlightened today, but there's no guarantee they'll continue to be 30 years from now.


It's not about volume or anything like that; the problem is that Facebook is a digital simulation of real-life social networks, with all the stimulants amplified and all the safeguards weakened.

On a real-life social network, there are safeguards that keep blatant lies and angry-mob behavior contained. In real life, you have only one persona, and when you misbehave you risk exclusion and loneliness. On Facebook, that's not the case: a person with an agenda can do very bad things without consequences.

Another example: in a real-life social network it would be very hard for a state to shape society; on Facebook, that's almost the business model, and it's done at scale.

I'm personally quite pessimistic about the long-term future of these social network giants, because I believe their modus operandi will eventually create enough trouble that every country will follow the Chinese route of control.


> On a real-life social network, there are safeguards that keep blatant lies and angry-mob-like behavior contained

Sadly, no. It just lacks "reach". It's harder to meet up with like-minded sociopaths and get a good angry mob going when every meeting has to be in person and every lie must be repeated verbally, but it's still very possible.


Actually, the place to find like-minded sociopaths is probably Reddit. The place to explore the boundaries of sociopathy is probably 4chan (which is something that I respect to some degree).

Facebook is more about shaping non-sociopathic people using corporate tools and money. What Cambridge Analytica did with Facebook is outstanding, they successfully managed to change Europe and America forever.

Sociopaths gathering on Reddit or 4chan are benign compared to the corporate influence of Facebook.

The potential destruction of businesses (like the UK's financial sector), the lives and careers of expats, the fate of dictators (the Arab Spring) and democracies (USA, France) were all determined by the power of giant social media corporations. The harm of occasional mobs on Reddit or 4chan doesn't extend beyond a few individuals.


Again, I recognize there is a big problem. I also recognize that I am not, at this time, contributing a proposed solution. But I hope to convince you that the solution we're sleepwalking into is really, really bad.


> are dragging Google and Facebook into the role of judge and jury of what is or isn't acceptable speech.

I couldn't have said it any better. Later, people might take issue with why a "public" company is now moderating free speech.


Every successfully productive and civil online community has, in my experience, been community-moderated.

If you want to participate, then you follow some simple guidelines and contribute in a positive way, which can include differences of opinion.

All of the platforms failing to provide productive and civil discourse are those that have no community moderation or, worse, are just personal soapboxes with an effortless virtue-signal button.

I think the platforms are broken by design, and humanity figured out a long time ago that there are limits to uncivil discourse and you need to recognise them.

The great thing about the internet is if you feel passionate enough about your idea, you have probably thought hard about it and you can go make your own community.

The issue with effortless virtue signalling and the no-consequences soapbox is that you don't have to think that hard about something so long as it sounds like it's up your alley, and you can be led very far astray before realising you're in a toxic conversation, potentially one with a subtext you didn't recognise.


The no-consequences soapbox existed for a long time even before Facebook. In South Asia there has been a history of people spreading false information via SMS and MMS too. Here's one recent malicious campaign, if not exactly a toxic one, from last November in Sri Lanka:

http://www.ceylontoday.lk/print20170401CT20170630.php?id=345...

A malicious and scurrilous countrywide sms (short messaging service) campaign over mobile phones, by unknown unruly elements, led to panic buying of both petrol and diesel and long queues in the Kandy District last night.


Eh? "Public" companies have always moderated speech; I can't think of a time when they haven't. Even if you restrict that to internet companies, Facebook already does moderation (although it seems wildly inconsistent). Heck, 4chan has some moderation, albeit of a very limited kind.


Facebook dragged themselves into that role when they decided to filter their news feed to promote the material that provoked the most reaction, and made a point of policing the stuff they did care about, like pseudonyms and mild nudity.


> We, as a society, across multiple governments and now the UN, with ample public support, are dragging Google and Facebook into the role of judge and jury of what is or isn't acceptable speech

Is anybody seriously advocating self regulation? When this comes up in my New York political circles, talked-about solutions have been bouncing between regulation by an independent commission and by a department under the Attorney General.


When FB and Google don't moderate enough, they're publicly criticized by authority figures, given wide news coverage, expected to respond, and occasionally threatened with being brought into line. On this occasion, they're implied to be complicit in genocide.

When they moderate too much, we all shrug: less coverage, growing pains, they can't be expected to be perfect all the time, it's-their-platform-they-have-the-right.

That incentivizes them to moderate more, always more.


They should NOT moderate discussions themselves, but they should offer mechanisms for people to flag and downvote, and reflect the dissent in the visualisation of posts (possibly even with automatic educational links about values).

Actually, I hate how flagged HN items completely disappear. Graying out should be enough here.
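A minimal sketch of what "reflect the dissent in the visualisation" could mean: flags and downvotes dim a post instead of deleting it. All names, weights, and thresholds here are hypothetical, not any real platform's behavior.

```python
# Sketch: map community dissent to a dimming level rather than removal.
# Flags are weighted above downvotes; opacity never drops below a floor,
# so a post is grayed out but stays readable.

def dim_level(flags: int, downvotes: int, views: int) -> float:
    """Return 0.0 (fully visible) .. 1.0 (maximally grayed out)."""
    if views == 0:
        return 0.0
    dissent = (2 * flags + downvotes) / views  # flags count double
    return min(1.0, dissent * 10)              # saturate at 1.0

def render_opacity(flags: int, downvotes: int, views: int) -> float:
    # Convert dimming to a CSS-style opacity with a 0.2 floor,
    # so heavily flagged posts fade but never vanish.
    return max(0.2, 1.0 - dim_level(flags, downvotes, views))
```

With these made-up weights, an unflagged post renders at full opacity, while a heavily flagged one bottoms out at the readable floor instead of disappearing.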


I dunno. I mean, yes in theory, but we don't need the theory: this has been tried multiple times, and there are two basic ways it ends up:

- Factual correctness and overall decency lose out to popularity: making a funny, low-effort post will in many communities get you a lot more magical internet approval points than doing the work of engaging with an issue, seeking alternative sources, etc. Yes, there are counter-examples, but the long-term trend is always toward a low quality of dialogue.

- At a certain scale, posts are simply concealed from a user who is likely to disagree with them: basically pre-moderation. Being challenged on a viewpoint requires energy, and people don't like to do work.


Set showdead to true? I almost forgot that was a setting; it must have been ten years since I changed it. Looks like you've been here a decade too, though (as of yesterday! I didn't realize they'd changed "days" to date joined on here...).


Flagged items disappear for a reason: to not give free rank to shitty articles.


No, we are saying that social media shouldn’t get “common carrier” status, which isn’t quite the same thing


Could you please elaborate? What's the difference between "they're responsible for what they host" and "they should decide what people can or can't say on their platform"? (Or, if I misunderstood, what are you calling "not quite the same thing"?)


Facebook et al want to control what content is published on their platforms when it suits them, but also want to hide behind common carrier status to evade responsibility. Having their cake and eating it. Or privatising profits while socialising losses, is another way to look at it.


“A couple of hours outside Yangon, the country’s largest city, U Aye Swe, an administrator for Sin Ma Kaw village, said he was proud to oversee one of Myanmar’s ‘Muslim-free’ villages, which bar Muslims from spending the night, among other restrictions.

‘Kalar are not welcome here because they are violent and they multiply like crazy, with so many wives and children,’ he said.

Mr. Aye Swe admitted he had never met a Muslim before, adding, ‘I have to thank Facebook because it is giving me the true information in Myanmar.’”

https://mobile.nytimes.com/2017/10/24/world/asia/myanmar-roh...


It's amazing and horrifying how easily people can be convinced to murder someone they've never met based on lies presented as truth. One of the lessons of the Holocaust that has to be continually re-learned.


It's so sad to see that some people are actually proud of being brainwashed by propaganda via social media.


Facebook doesn't spread hatred, people spread hatred.

Sure, Facebook could crack down on the way people are using their service/their tool, but I don't think they are responsible for the actions of users.

What about:

- Broadband internet is spreading hatred

- Chrome Browser is spreading hatred

- ISPs are spreading hatred

(etc)

Don't blame a company, or the service they provide. A crackdown on Facebook will force the hating people to another tool or communication method. They'll probably move to a more encrypted method of communication, which will make finding the wrong-doings and the people much harder.

The only thing they could do is ask Facebook to cooperate and provide details to find the haters.


> Don't blame a company, or the service they provide

Facebook is making billions by exploiting exactly that kind of behavior. The more one-sided and scandalous a headline is, the more clicks it gets, which leads to it being seen by more people.

Also, showing people only what they want to see leads to echo chambers and is detrimental to society, as those groups tend to develop into extremes that refuse to accept anything outside of their bubble.


> Also, showing people only what they want to see leads to echo chambers

Not sure I agree with this. The internet made it possible for the whole world to see information coming from all kinds of sources. Before, in history, you had to rely on local news/newspapers, or even less information. Much more bubbly than this modern age.

Again: I think the reason those bubbles appear has to do with society and the urge to belong to a group; Facebook is not to blame here. Facebook reflects people and society, not the other way around.


I don't know about the US, but at least in Germany, newspapers and even tabloids have certain self-imposed rules regarding fact checking and privacy. All those rules are out of the window on Facebook and other social media, and because everybody is effectively anonymous (yes, even on Facebook), nobody can be held accountable.


This is insane...

Let’s start with logic: using Facebook is a clear indicator that the platform is preferable for spreading hatred compared to the alternatives. Blocking this avenue will therefore harm the propaganda effort; the discussion can only be about the magnitude.

And there is some obvious support for the idea that the magnitude will be significant: no social media platform has made significant inroads against FB’s reach. People aren’t actively searching for sources of hatred; they simply happen upon the content mixed in among their friends’ baby photos.

Just look at the barren cesspool of hatred that is Voat: it doesn’t have anywhere close to the reach of the market leader these xenophobes were kicked out of, Reddit.


I don't blame Facebook. What I blame are politicians and special interest groups making the claims they do without revealing the true reason they are upset: they no longer control the message, or the distribution of the message.

Regardless of the factual content, politicians and special interest groups were always able to control what we were presented with as truths and lies, through the cooperation of the press, which in many nations is not independent; even where it is, it falls into the trap of wanting access, so it simply becomes a mouthpiece.


> which will make finding the wrong-doings and the people much harder

This is exactly the point: to push hate-mongering back into the fringe it belongs to. Law enforcement will still be able to crack down on them, have no fear.


There were massacres before Facebook, and there will still be massacres after it. As much as I dislike Facebook, they are just a platform, and if they weren't here, another one would take their place, so it's a bit weird that the UN investigator would blame them directly. Ultimately people, not FB, are responsible for what they do, and for being too easily manipulated by online propaganda.


> if they weren't here, another one would take their place

This is a cop-out. Hateful false information, with violent consequences, is being spread by a company profiting from its distribution.

We don’t say, in the event of food poisoning, “we can’t blame the restaurant because if they weren’t here someone else would be poisoning people.” We fine the business and send a message to the industry that said behaviour is unacceptable.

Facebook is contributing to and profiting from atrocities. That rightfully merits scrutiny, of Facebook and ad-driven social media companies generally.


They simply provide a way to communicate between people. When a newspaper incites hatred, we might blame the editor, the writers, or the readers, but we don't blame the paper company.

What I mean is that the service provided by Facebook is somewhat trivial - it can easily be replaced by emails, chats, or any of the countless ways to communicate via the internet. Shutting down Facebook will not solve the problems in Myanmar.


That's a terrible analogy. Unless there's a cabal of restaurants with a plan to poison people, it's probably an accident. Spreading hate like this, however, is deliberate, and if there wasn't Facebook, people doing this would most certainly find a different venue.


This isn't a cop-out. Anything different and Facebook would be blamed for censorship. This is a difficult problem to solve: it's practically human nature to form a mob and groupthink. It's one of the first things kids do in pre-school.

I think the only solution to this problem is education.


> Anything different and Facebook would be blamed for censorship

Facebook isn’t concerned about being “blamed for censorship” when it benefits their bottom line [1].

> This is a difficult problem to solve

The first step is recognising that ad-driven reaction-oriented social media does not appear to positively self-correct.

[1] https://mobile.nytimes.com/2016/11/22/technology/facebook-ce...


I get that social media is bad for you. The question is: does the rest of the world agree? Perhaps I can liken it to sugar: it's bad for you in large quantities, but it is legal.


The design of the system definitely contributes to how easily malicious propaganda can be spread through it.

The tendency of the news feed to promote attention-grabbing content biases it toward extremes, and this is definitely a systemic problem that shouldn't be ignored and swept under the rug.


"There was food poisoning before this restaurant that failed inspection, and food poisoning will still exist even if we close it. If it wasn't there, another one will take its place. Ultimately people, not restaurants, are responsible for what they eat."


For context: https://en.wikipedia.org/wiki/Radio_T%C3%A9l%C3%A9vision_Lib...

Eventually, decades after the massacres in Rwanda, those responsible for the genocide-promoting radio station were given long prison sentences.


That’s a sad idea of powerlessness.

It’s also wrong: Facebook doesn’t have a competitor in its core market, and the idea that people would start thinking “you know, Facebook is great. But I just wish there were a platform just like it with a bit more calls to violence against minority groups” is ludicrous.


Those responsible for the propaganda against the Rohingya used the radio before; now they use Facebook, and after that they'll use something else. Education and a proper government that would censor the hate groups could help solve the problems in Myanmar, but e.g. shutting down Facebook would not change anything.


> shutting down Facebook would not change anything.

You’re just asserting this without any argument, even though it was addressed in the post you’re responding to.

What would be used instead of Facebook? If that other platform is as convenient and effective as FB, why is it not used today?

And since you are agreeing that censorship of hate could help: why do you believe the appropriate distribution of the responsibility for such censorship is more important than stopping a genocide?


Ironically that seems to have happened with reddit - there are a couple of competitors which are "just like it with a bit more calls to violence against minority groups".


And they are completely irrelevant and have nowhere near the reach and influence that posts on Reddit have, nicely proving my point.

The content on these sites is also so far from common decency that people will not make the mistake of thinking that messages they read there are acceptable in common discourse.


As much as I am concerned about Facebook's reach, this story presents a rather narrow vision of the problem. The region has previously seen people spreading hatred and misinformation through plain old SMS and MMS too.

The only way governments tend to control this is by blocking mobile/SMS services. In this case, unfortunately, the only way around it is to block Facebook or internet services until things calm down.


Too bad the Rwandan genocide happened in the 90s. If Facebook had been around back then, we could have washed our hands and blamed them for that too.


As it happened, the "social media" of its day was a pro-genocide radio station. It was not only blamed (correctly) for the genocide; its directors were arrested, charged with crimes against humanity, and sentenced to over 30 years in prison.

https://en.wikipedia.org/wiki/Radio_T%C3%A9l%C3%A9vision_Lib...


By that logic, the Nazis weren’t responsible for WW2 because they weren’t around at the time of the Thirty Years’ War.


I went to Myanmar at the beginning of 2015 for a few weeks and was blown away by how many people had cell phones. Only a few years before, phones were unobtainable due to extremely high SIM card prices (there was only one provider). Once they allowed in competition (two other networks), the price fell to under $1 per SIM card, and in a flash everyone was online.

What really stood out was how much of the country ran on Facebook: groups for everything, pages for businesses, Messenger for communication, etc. It resembled being in China, where everything runs through WeChat.

I guess this is one of those consequences.


It happened last week in Sri Lanka too. Although the government was quick to block FB, I think three Muslims died.


It happens in India too. Whenever there is a mass movement, the government tries to stop social media, and everyone starts blaming the government for curtailing free speech. But I think there is a really good reason to do it, considering the lack of resources to effectively police the population. One bad example I came across recently was this news: fake WhatsApp/Facebook messages led to the killing of 7 men.

https://www.hindustantimes.com/india-news/a-whatsapp-message...


I heard a story in Singapore. I don’t know how much truth there is to it, but basically: there was a conflict going on in China, and every time the news told the story, the issue grew worse. China asked Singapore for advice, and LKY said to stop putting it in the news; no one will know about it, it won’t grow, and it will die down. So China forced the media to stop talking about it, and the conflict died down in a few days.


Militant Buddhism must be really blocking some chakras in the West.


Amplification of a message is the problem, not being a medium for the exchange of messages.

Facebook can and should only be held responsible for amplifying a message, and should not be accountable for being a medium.

The information on who paid, when, and how much to amplify what message/post, and to whom, should be publicly accountable. This model should apply to any ad agency.
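The who/when/how-much/what/whom model described above amounts to a public ledger of paid amplification. A minimal sketch of what one such record could look like; every field name and the `disclose` helper are hypothetical illustrations, not any existing API.

```python
# Sketch: a public ledger of paid amplification, one record per purchase.
from dataclasses import dataclass, asdict
from datetime import datetime, timezone
from typing import List

@dataclass(frozen=True)
class AmplificationRecord:
    payer: str          # who paid
    paid_at: datetime   # when they paid
    amount_usd: float   # how much they paid
    post_id: str        # what message/post was amplified
    audience: str       # whom it was targeted at

ledger: List[AmplificationRecord] = []

def disclose(record: AmplificationRecord) -> dict:
    """Append the record to the public ledger and return the published entry."""
    ledger.append(record)
    return asdict(record)

entry = disclose(AmplificationRecord(
    payer="Example PAC",
    paid_at=datetime(2018, 3, 1, tzinfo=timezone.utc),
    amount_usd=5000.0,
    post_id="post/12345",
    audience="adults 18-35, region XY",
))
```

The point of the frozen dataclass is that a disclosed record is immutable: once published, neither the platform nor the advertiser can quietly amend it.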


>Facebook was a huge part of public, civil and private life, and the government used it to disseminate information to the public.

Seriously? I know they probably have bigger things to worry about in Myanmar, but that's pretty crazy.


Which part? That the government uses the most effective communication channel to disseminate information?


Why exactly is Facebook to blame here? I'm not a fan of the company at all but, as I've said in earlier topics on more or less the same thing, I don't think it's fair to hold them responsible for what people post there, any more than it is to hold internet providers responsible for people using their connections for the same thing.


Because Facebook already controls what people say and see on their platform. But they choose to use that power to censor boobs and to show you whatever will provoke the strongest response.


> they choose to use that power to censor boobs

Do they, or are they forced to do so?

And as I said: I don't like Facebook. I think they're shifty and probably do have an agenda. However, it's untenable to attempt to force them to censor anything that a government, or hell, society, doesn't approve of. Facebook has 2.2 billion active users, who post 510,000 comments, 293,000 status updates, and 136,000 photos per minute. It is impossible for a company to go through all of them in order to censor those which break whatever law of the country they were posted in.


On the one hand you're saying that Facebook is effectively "forced" to censor nudity (which may actually be legal in some jurisdictions), and then you immediately say it's "untenable"?


I don't see why stating two facts, namely that governments want Facebook to do something (for example, Germany forcing Facebook to censor what it considers hate speech) and that this situation is untenable, is in any way contradictory.


Do you really believe it’s illegal to show breasts on the internet?


On the internet? Difficult question.

In certain countries, to people below a certain age? Certainly.


Yesterday I came across a YouTube video with 1M views in which the expulsion of white farmers from South Africa was demanded. While they might have a point, the language in the comment section was so harsh that it cried out for some form of moderation.


Change white to black, or Jews, or... and see if they "might still have a point".


Context matters. South Africa has the largest inequality in the world. Colonialism created a society where the descendants of armed invaders control 80% of the farmland. Whites generally live a comfortable middle-class life, while the native population often lives in abject poverty.

South Africa is also a glorious example of reconciliation, overcoming a history of injustice without violence.

But the farming community simply has not used the chance they had in the last 20 years to become part of this historic process: while universities and many other groups and institutions have de-segregated, that conservative community has dug in and defended their privileges at all costs. They invited the government to eventually force their hand.

Expulsion is obviously a bridge too far. And Zimbabwe stands as the obvious example of what not to do. But some sort of legislative land reform is now inevitable.


> overcoming a history of injustice without violence

I'm not sure the truth and reconciliation commission would entirely agree with that.


That ignores the entire history of South Africa but go nuts.


UN is to blame for keeping religion alive.



