UN: Facebook has turned into a beast in Myanmar (bbc.com)
504 points by abhi3 on April 2, 2018 | 297 comments



This is a case of government-authorized genocide. I just don't understand how shifting blame toward Facebook as a data sharing platform solves the problem. If someone sends hate-filled letters to the public via USPS, is USPS complicit? Yes, FB could probably have done better by monitoring the content and muting hostile posts. I hail from a small industrial town in eastern India, where very recently racial tensions erupted between pockets of Hindu/Muslim communities. One of the first things the local government did was to turn off the internet (data and ISP) to control the proliferation of rumors and anything that could incite further violence.


If they get credit for the Arab Spring, they should get credit for this, too. This is not solving a problem, this is a part of understanding how the world is changing and what is contributing to it.

If we don't identify and understand what is going on, we won't have an informed response.


> If they get credit for the Arab Spring, they should get credit for this, too.

Facebook shouldn't get credit for the Arab Spring. They didn't start it, they were just one of many mediums used for its organization and dissemination.

What they and all social media get credit for is existing as a communications platform that's largely uncensored.


If the government controlled in what order you saw things on television, or whether or not you saw some of them at all, would you call television largely uncensored?

Why does Facebook's algorithmic ranking of feed content get a free pass, here?


Television is censored. So is radio. In exchange for the use of spectrum, media companies agreed to restrictions on content. Even beyond the injunctions against certain behavior (swearing, nudity, etc.), broadcasters are subject to positive responsibilities about how they use their airtime. TV networks are legally obligated to run news programming.

The FCC is insulated enough from partisan politics that its censorship doesn't run along partisan lines, and is therefore not very controversial. But TV is absolutely censored.


Broadcast television is probably censored everywhere, but we're talking about Myanmar (and in this subthread, India) so the FCC is not highly relevant.


Crediting technology platforms with the Arab Spring is a really misleading and troublesome narrative that a lot of people blindly accept.

"Credit" confuses causal importance with (human) responsibility.

Normal Borlaug engineered a variety of wheat that fed a lot of people and saved a lot of lives. What deserves credit? The wheat or the inventor? Or Mendel et al., for pioneering the principles of genetics that Borlaug used in his work? Even if "someone else would have done it if Borlaug didn't", he is still (one of) the most important human actors responsible for averting the crisis. The materials/products used are causally important; not humanly responsible.

In this case, humans were responsible for the Arab Spring. For the most part, social media platforms were causally important to the process.

It gets much harder to draw that line, though, when the platforms are behaving in activist ways (e.g. $literally_any_technology_headline_in_the_last_week).


*Norman Borlaug. Not "Normal".


> credit for the Arab Spring

Is it still viewed as a positive event? Any countries where it led to improvements instead of instability?


In retrospect, they might not want credit for the Arab Spring either.


If Facebook delivered every single piece of content to every single recipient, the way the USPS does, then maybe the analogy would hold. But they don't. They use algorithms to determine what to show and what to hide, and IMO they are responsible for that.

Imagine if the U.S. Postal Service could keep track of which pieces of mail get opened or read, and then they decided to only deliver the mail that made people the most angry. Sound like a good postal service?
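
To make the analogy concrete, here's a minimal toy sketch of what engagement-maximizing ranking looks like. This is purely illustrative and assumes nothing about Facebook's actual ranking code; every name, field, and weight in it is made up:

    # Hypothetical toy sketch only: a feed ranker that optimizes for predicted
    # engagement. Not Facebook's code; all names and numbers are invented.
    from dataclasses import dataclass

    @dataclass
    class Post:
        author: str
        text: str
        predicted_outrage: float  # 0..1: how provocative a model thinks it is
        predicted_clicks: float   # 0..1: how likely the reader is to engage

    def rank_feed(posts, outrage_weight=0.7, click_weight=0.3):
        # Score each post by predicted engagement; the most inflammatory
        # content floats to the top, the rest quietly sinks out of sight.
        def score(p):
            return outrage_weight * p.predicted_outrage + click_weight * p.predicted_clicks
        return sorted(posts, key=score, reverse=True)

    feed = rank_feed([
        Post("aunt", "Nice holiday photos", 0.05, 0.40),
        Post("stranger", "THEY are coming for your family!!", 0.95, 0.80),
    ])
    print([p.author for p in feed])  # ['stranger', 'aunt'] -- the outrage post is delivered first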


USPS has rules about what can and cannot be sent, and it seems like they can open your mail if they suspect it contains something illegal [1].

[1]: https://gotogreatpanes.com/blog/2014/02/03/can-usps-open-my-...


That seems pretty different from what Facebook is doing. Facebook is pretty obviously curating content, but USPS is not.


That's because people can ship dangerous things in the mail. The post office isn't controlling information sent in the mail.

There's a big difference between opening something up (many times needing a warrant, as your link says) because it may contain a weapon or biological agent or invasive seeds of a plant species, vs controlling what information is passed or not.


This isn't an academic debate over who is technically responsible. It's a genocide, happening right now. Facebook should be doing everything they realistically can to use their position to save lives.


There is no genuine moral outrage here, anybody with half a brain knows that Facebook is no more to blame for this genocide than the printing press was to blame for the holocaust. Governmental institutions are currently making a concerted effort to falsely blame social media for society's woes in an attempt to rally support for state regulation and control of social media platforms.


Actual comment from the article:

"A colleague acknowledged that the service had helped people in the country communicate with each other."

It leaves me dumbfounded that a journalist can write that sentence and not see anything wrong with the whole argument.


> Facebook is no more to blame for this genocide than the printing press was to blame for the holocaust.

The printing press was a physical machine. How anyone can posit this as an argument with "half a brain" is beyond me. These are completely different scenarios.

Facebook can curate content and has a long history of doing so. That immediately makes it entirely different from the printing press. Comparing it to a newspaper is fair: it curates content. Sometimes it is biased, willingly or not.

It's a shame that on HN, whenever the industry has issues, they're immediately ignored and shuffled under conspiracy theories/government takeover/etc. There are valid concerns here, never mind the fact that entities like Facebook and Google are essentially becoming governing bodies.


I think hate speech on Facebook is just the most visible problem. A more sinister use of Facebook is to use friend lists to identify people to target in real life. A bad actor might not even leave any record on Facebook itself that they're doing anything wrong.

To work off of your post office analogy, consider if it were possible for random third parties to inquire of the post office who some particular person has corresponded with in the last several years and consider how that could be abused in a dysfunctional society where being friends with the wrong person could have very bad consequences.

I don't know if there's a solution to that, other than for Facebook to be much more restrictive about who can see a user's list of friends. However, that would interfere with the way people ordinarily use Facebook.


USPS declined to carry leftist periodicals during the Red Scare. It's no bastion of free speech.

In the case of a government authorized genocide, however, it's likely the whistleblowers who would be silenced by government regulation of political speech.


What a lot of Facebook defenders on HN seem to be ignorant about is the context of the finger pointing. FB stopped being a "data sharing platform" when it muscled its way into becoming synonymous with "the internet" in much of the underdeveloped world. In many of these countries (including Myanmar) people can get a really cheap plan that offers next to nothing beyond "Facebook Free Basic", FB's walled garden (e.g. for $3/month you get 100 MB of data + unlimited Facebook). This is effectively a two-tier Internet with FB being the obvious default. It's not as if FB was unaware of the potentially detrimental effects; they just chose to be blind to them: https://www.mmtimes.com/business/technology/20685-facebook-f...

> In 2015, researchers evaluating how Facebook Zero shapes information and communication technologies usage in the developing world found that 11% of Indonesians who said they used Facebook also said they did not use the Internet. 65% of Nigerians, and 61% of Indonesians agree with the statement that "Facebook is the Internet" compared with only 5% in the US. Source: https://en.wikipedia.org/wiki/Free_Basics#User_experience_re...

Facebook certainly isn't the party chiefly responsible, but it's an effective catalyst.

Consider that people in Myanmar have been isolated from the world for the better part of the last 40 years. They're effectively playing catchup with the rest of the planet. The country is already afflicted by all the symptoms you can attribute to dictatorship, tribalism, various conflicts, lack of information, lack of education, superstition, lack of infrastructure, etc. Take that as the base context and then introduce the population to technology (smartphones) and "the internet" (mostly via FB). Step back and watch bad things happen... much faster.

Regardless of the article not pointing all fingers at all culprits, what concerns me specifically about Facebook is their recklessness, especially in some of these emerging countries. Maintaining peace in many of these places is already difficult and the various agreements are fragile, but FB, in its pursuit of growth and profit, has maintained an attitude of continuous and utter denial when it comes to its product's potential as an instrument of propaganda. That, to me, is almost insulting.


There's a cost (both time and money) associated with spreading knowledge through USPS or similar physical mediums. The cost of spreading ideas is virtually free on social media so there isn't much of a barrier to prevent the ideas spreading like wildfire. Good or bad.


What about the costs of having the platform centrally regulating the content on the platform based on an arbitrary, context-free, highly politicized, and ever expanding definition of 'socially unacceptable' content?

We all knew that when FB and Twitter started to centrally control content well beyond the obviously really bad stuff (i.e., gore, child porn), that mandate would forever expand and expand, to the point where it's almost impossible for FB not to be criticized for not doing enough... absent a massive expansion of content controls, which would be massively expensive, which would incentivize a) simply limiting people's ability to communicate on the platform and/or b) automation. Which ultimately means countless false positives and examples of bias by machines trained by the most vocal special interest groups deciding what is okay and not okay to say to another person.

The future is going to feature some interesting trolling to see who can game algorithms to get topics banned on social media through phony media outrage campaigns, false reporting, and social engineering.


It's especially depressing because the U.N. was founded to prevent this very thing from occurring, and they have nothing to say about it but to blame an American company.


> One of the first things the local government did was to turn off the internet (data and ISP) to control the proliferation of rumors and anything that could incite further violence.

It's so good to hear that there are some responsible governments out there with respect to the dangers that Internet rumors can bring in volatile situations.


> I just don't understand how shifting blame toward Facebook as a data sharing platform solves the problem.

Isn't this a bit of a red herring, though?

First, "shifting blame toward Facebook" is not the only thing being done. People are finally allowing themselves to become somewhat aware of the toxicity and perils of data[0]. Facebook is just the first (and one of the biggest). Many people (esp. on HN) already knew, but even among them, a lot preferred to believe it probably wouldn't happen, or just not think about it much (like climate change; faced with a problem of impossible magnitude).

Second, that it doesn't solve this particular problem (immediately), isn't a reason to dismiss placing blame. In fact, the act of placing blame doesn't solve problems by itself, usually.

However, putting a spotlight on the data hazards of social media may help solve some of the problems, by pushing the corporations farming these networks to come up with better solutions. The effect on Myanmar will be small (but what do you expect? the damage has already been done), but it's totally worth it for the combined effect of putting up a barrier, even if it merely lessens the odds that these large attack surfaces for propaganda-based exploits of the human psyche can be used this way anywhere else in the world.

And it is currently happening in a lot more places than we are aware of (or perhaps even expect).

It has to be. I've seen technical presentations on security and privacy (from CCC, etc.) speculating about these ideas since at least 15 years ago: "What if governments got hold of all this data and used it for nefarious purposes?" What they didn't always predict, though it was sometimes implied, was groups other than governments doing that. Interestingly, I definitely can't remember predictions about it being offered as a commercial service by shady analytics companies such as Cambridge Analytica and Palantir. Maybe more recent talks cover this; I haven't been keeping track as much. But it's accelerating the problem in a somewhat unexpected direction. Question, does anyone know: are those services not technically black market (or at least partially)?

[0] I don't want to single out "data sharing" as the big problematic thing either, there are lots of other dangerous things that can be done with data from user-tracking. In fact, just like radioactive waste, I'd argue it is still dangerous when at rest (or worse, you can't always know if no copies remain).


"A couple of hours outside Yangon, the country’s largest city, U Aye Swe, an administrator for Sin Ma Kaw village, said he was proud to oversee one of Myanmar’s 'Muslim-free' villages, which bar Muslims from spending the night, among other restrictions.

'Kalar are not welcome here because they are violent and they multiply like crazy, with so many wives and children,' he said.

Mr. Aye Swe admitted he had never met a Muslim before, adding, 'I have to thank Facebook because it is giving me the true information in Myanmar.'"

https://www.nytimes.com/2017/10/24/world/asia/myanmar-rohing...


> Mr. Aye Swe admitted he had never met a Muslim before, adding, 'I have to thank Facebook because it is giving me the true information in Myanmar.'"

There are so many things wrong about this single sentence.


Seems like business as usual. Those most vocal about "the others" are always those least likely to have met any.


That is such a sad quote. Just depressing in every way.


I really like how there's a person half a world away, whom you have also never met, and you're sure you're much smarter and more understanding than they are about their local situation.

What makes you so sure you're the one getting so many things right at once?


The part where they eagerly admit their own ignorance?


I am ignorant about everything Myanmar; I don't think I can judge the guy. Am I now at fault too?


I would choose a person who admits their own ignorance any day over a person for whom it doesn't even enter the picture.


Every time this comes up I mention: https://en.wikipedia.org/wiki/Radio_T%C3%A9l%C3%A9vision_Lib...

Perhaps in a decade or so's time someone in Facebook will be called before an international criminal tribunal for what's happened there. Perhaps not. There's a bit of a case backlog.


Facebook is the medium, not the publisher though.

The people who built the radio didn't get called before an international criminal tribunal; it was the broadcasters who did.


> The people who built the radio didn't get called before an international criminal tribunal; it was the broadcasters who did.

FB is both though. They built the system & tech (the ‘radio’), and via their algos/policies also control what gets spread and to what extent (the ‘broadcasting’).

It’s not at all clear cut.


Facebook built the internet?


I was recently in a small town in Myanmar where they have public electricity 4 hours a day. I noticed that most of the plugs were used to charge mobile phones, a device that many had never even seen a mere two years ago.

Now, consider that it's common across South-East Asia to have very basic data plans that allow you to navigate Facebook for free (or at heavily discounted rates). https://www.mmtimes.com/business/technology/20685-facebook-f...

For a huge number of these subscribers, Facebook is the Internet.

People in the developed world, who have gone through innumerable Internet phases and fads, and who presumably have more experience and a better sense of its subcultures, still have a hard time distinguishing fake from legitimate news.

How do you think a small towner in Myanmar who went from zero to Facebook overnight would fare?


That is a depressing thought. From zero to FB. I think the fact that FB has shielded much of the developing world from the Internet at large gets too little press. We accept FB here because they don't get to cut off the rest of your Internet. Imagine if you only had FB and not the rest of the Internet... horrible.


I think it's better to have Facebook-only than no internet at all. If those are the options, I'd pick Facebook too.


Are you speaking as an informed and responsible user? If so, then you don't count. A more appropriate way to look at the situation would be to imagine Facebook as the only option that young children have. It's not just about how you use it, it's also about how it indirectly affects the people around you because of the pervasive lack of due diligence. Facebook as the only option might offer some boons, but is it worth it? I think most of us would agree that if we objectively weighed the pros and cons of the "real" Internet, some solid arguments could be made that its more positive sides make up for its darker corners in spades. Could Facebook pass a similar test?


Really? What is the Facebook-only internet providing on top of the SMS and phone calls of feature phones that would make you pick that flood of targeted advertisements? Remember that you can't e.g. follow the shared links on your feed. And the free Facebook I have used on my travels does not let you load images.


To some extent yes. In developing countries (including Myanmar) they are pushing free “access” to the internet via the Facebook app. So for a lot of people in these countries Facebook is essentially the internet.


Interestingly, there's a parallel for this in the Rwandan genocide. A lot of the violence was prepared and fueled by a Hutu-run radio station:

>Radio Télévision Libre des Mille Collines (RTLMC), which broadcast racist propaganda, obscene jokes and music, becoming very popular throughout the country. One study finds that approximately 10% of the overall violence during the Rwandan genocide can be attributed to this new radio station.

https://en.wikipedia.org/wiki/Rwandan_genocide#Preparation_f...

https://en.wikipedia.org/wiki/Radio_T%C3%A9l%C3%A9vision_Lib...

The two founders and one announcer were sentenced to life, subsequently reduced to ~30 years in prison; in that case, the people who built this particular radio station and its broadcasters were punished together.


Facebook owns and operates the radio station though


Exactly. The big platforms are getting too mature to get a pass on this stuff anymore.


[deleted]


> people not be allowed to communicate with each other

People should not be allowed to communicate certain things which lead to harm to other people, no. I don't understand why this gets turned into "people not be allowed to communicate" (implying "at all"?).


The picture I was trying to paint, albeit not very well, was that there was no group that we could trust to decide which things people should be allowed to say, and which things they shouldn't be. (I didn't get the point across very well, so we ran into a race condition after I decided to retract it. Sorry.)

If we think that Facebook should decide what's permissible, then that's US culture being the global nanny. If we let the local government decide what's permissible, then you better watch out because every single genocide in history has involved demonizing the victims as the real danger (so, their speech would be pointed at as "harmful" and banned, the perpetrators would be painted as organizing a just revolution). The worst genocides were conducted with state support, and wouldn't be stopped if local national governments were moderating Facebook. Forming a special panel to decide which groups deserve to get their message out seems like a very dangerous and poorly motivated concentration of power. I can't think of any other ways to decide which speech is dangerous and ban it, therefore I conclude we're stuck with not doing it.


Outside of human-based moderation, the real issue is that Facebook is deciding what people will see in the feed all the time. People somehow try to liken Facebook to the post office, or whatever, like it's some dumb pipe. It is not.

Facebook shows you whatever the company decided from the feeds of your "friends". Probably based on some algorithms, but it could be millions of cats rolling over keyboards, because what Facebook really does is not public.

Third parties analyze and weaponize this behavior for many things, from commercial interests, to justifying murder, to incitement to violence.

Facebook needs to be held responsible for its algorithms. Somehow. Stupid autonomous car makers will eventually be held accountable for their algorithms too. It's not like they can put some crap in their EULA and be done with it.

PS: Add to it that Facebook is the internet in some places. It's not just one of the websites.


There's one thing I don't understand when people say things like this. Who would enforce this? Well governments would. And so you want our government to have the overt right to spy on people and punish them for private speech that the government will be the final arbiters of deciding right versus wrong on? I mean think about this. History has shown that it's not a question of if but when bad actors take over governments. And even when the actors are not overtly malicious this would be a dubious idea.

Many people today wish we had a more open world, myself among them. Do you know we didn't use to even have passports? There was no 'Schengen zone.' You could freely travel between nations as a human. But as a "temporary war measure" passports began to be required in WW1. And once governments had that new level of power and control, they of course not only did not relinquish it but expanded it by orders of magnitude. In the US, income tax was another "temporary war measure." Government is not your enemy, but it is also not your friend. Beware of the implications of granting power to an authority who does not, or at some time will not, necessarily have your best interests in mind.


I do not agree.

People should be allowed to communicate period.

People should not be allowed to harm other people.


Well, which one? Some communications are harmful to third parties.


Can you give an example?


Traditionally libel/slander are the obvious examples, but modern examples include identity theft (A takes out a loan from B in the name of C => harm to C), security breaches (A leaks C's password from B's website), "revenge porn", outing people's sexuality without their consent (can get people killed), terrorist radicalisation (A convinces B to murder C), and so on.


"Well, which one? Some communications are harmful to third parties."

A recent example of this is the ability of conspiracy theorists to spread stories after these mass shootings claiming that the attacks were staged or the victims are "fake". This further pushes harmful propaganda that there is no real threat to students in schools, and the "third party" children are the ones who suffer from this communication.


It's countries filled with uneducated people who don't understand how to evaluate information that are vulnerable to this kind of propaganda. I think a more fundamental long-term solution is to fix the education and critical thinking problem. After you've done that, you can allow free speech, and democracy for that matter. By default, people aren't competent to safely handle free information because they're so easily manipulated into violence.

For example, see every war ever. In any war, both sides believe they're right, but they can't both be right, so one or more is always wrong. They just can't see it because they can't process information safely.


I would say vanilla education had nothing to do with it, if that's what you mean. Daily, I come across extremely well-educated folks with college degrees (including engineers) who are under the sway of fake Facebook and WhatsApp forwards. They were swayed by radio, TV and newspapers before; now the medium has changed. The story is the same in the USA or Europe or Myanmar. There are no special critical-thinking courses out there where people sign up and learn to "think effectively". People have been swayed by propaganda before and they will continue to be. Saying "education" in this context feels like you are leaning towards saying that if only the people in these "uneducated third-world countries" learned better, this wouldn't happen. That's not right, is it?


> They can't both be right

And this is where you are lacking critical thinking. Single-issue wars and issues with only two sides are rare enough that I'm not sure any have ever existed. It's definitely possible for both sides to be right (and wrong), and it's definitely normal that educated people won't agree no matter how long they discuss an issue.


I'll permit myself a moment of extreme cynicism and say that two warring parties can both be right even on a single issue.

"I would be better off with that bloke's resources".

I'm not sold on the idea that wars are fought for ideals. There is a lot of coincidental wealth transfer when countries fight. Why else travel long distances to kill people you don't know?


Why can't they all be right?

It's not like there's a big man in the sky telling everyone that wars must only be fought between two opposing sides with identical motives on a logically comparable continuum.

It's not even like wars must even be based on positions that can be meaningfully called right or wrong. Or that most wars can be meaningfully interpreted as "two sides". Even within the allies of world war 2, you would have a variety of factions and motivations, as well as minority players with diametrically opposed views and beliefs.


That would be all of them. There's plenty of people right here in the United States who think Infowars is a real news outlet.


> I think a more fundamental long-term solution is to fix the education and critical thinking problem.

And who is going to do that? Governments have no incentive to do that, because a lack of critical thinking in citizens is a major factor in ensuring they can get away with exploitation and in preventing citizens from revolting.


"And who is going to do that?"

Private companies try to offer primary education in some developing countries. Since the problem is not only a lack of resources, but a lack of infrastructure to use those resources (no trained teachers, etc.), this is a plausible path forward. I have no statistics on the results. Is it just one more scheme to funnel aid money into investors' pockets? I have no idea.

https://www.economist.com/news/briefing/21660063-where-gover...


Currently all educational system teaches students to respect and obey authority. So right now, in our society, people first obey; only then do they think to question authority. The primary education from private companies is probably going to be the same.

Teaching students critical thinking means they are encouraged and taught to ask questions first, before obeying. Governments prefer a population that just works obediently and pays taxes without questioning over a populace that asks questions at every turn. So they would prevent the kind of education that could lead to such a population.


"Currently all educational system teaches students to respect and obey authority."

Well, all public education is based on the Prussian template so what would you expect :)

But seriously...

I think the tendency of the masses to follow the perceived leader is a facet of the human psychology rather than the result of a nefarious scheme.

From what I can tell of the education of my kids in Finland, the teachers really try to walk a tight rope between having small kids behave in an orderly manner as a group and not to stifle individual creativity and will to learn.

You need some authority, and respect for it, in schools. Otherwise it would turn into a Lord of the Flies remake. I presume your critique is directed towards overbearing, paternalistic, unnecessarily harsh discipline.

I think the amount of this depends on the country you are discussing, and you need to be more specific. All primary education systems are not alike.


>I think the tendency of the masses to follow the perceived leader is a facet of the human psychology rather than the result of a nefarious scheme.

Maybe to an extent. But you cannot deny that this is exaggerated by social pressure and training from childhood. For example, in most schools, obedience is seen as a foremost quality of a good student.

Imagine the animals in the wild and how they can be tamed by training them in a specific way. You can see how a small man or woman is able to control a large animal such as an elephant or tiger in this fashion.

When you look at that, and then see how, if people put their minds to it, they could so easily revolt against the increasingly oppressive actions of governments, and yet they don't, it is just like how a tiger or lion could so easily refuse to obey its trainer, but doesn't, only because of the training it has received from birth, so that the thought of not obeying just does not cross its mind.

You see, you have to stop looking at the people individually, and have to look the populace as a single entity. Like a reagent in a test tube. You pour in the obedience and compliance and you get more work and taxes out of it (and less unwanted reactions). You pour things like nationalism and patriotism into it, and you get lives to expend in the name of service of nation. You pour dreams of a better tomorrow, and you get votes...


"You see, you have to stop looking at the people individually, and have to look the populace as a single entity."

I think all revolutions are started by tiny groups of inspired people, and the hoi polloi will follow them, or won't, based on the current general mood and the network effects they instigate.

That's also an inspiring thought. To make a change, you don't need to convince everyone. Just a tiny group will suffice, and if the dice fall correctly, everyone else will just follow along.

This, of course, applies to both beneficial and pathological changes. The coarse-grained group mind of the population is not familiar with ethics. That's why free speech and suppressing complete dickheads such as neo-Nazis exist in a delicate balance. A small group can cause harm to a society.


Basically, we need to build a society where each member is inclined to think for themselves, and not inclined to just go with whatever is "out there".

The funny thing here is that, since such a society won't lend itself to control easily, governments do not want this. They want a population that can be controlled easily, but then, as you said, it can also be controlled by a small group causing major harm, and the government's solution to that is regulating free speech.

So governments win on both fronts and people lose on the same.


And even if they try, because there is economic incentive (educated population -> better competition on the global market), people will still accuse them of pushing propaganda and brainwashing, and they'll fight the changes.


Reality is often more insidious than that... Most issues are (sadly) not of a binary nature but created through misunderstanding and different evaluation functions between parties. In general, every position has some "truth" or "value" to it that other positions are not able to subsume.

For example, immigration in Europe... some people want to make it easier others want to curb it completely. Who is right? There are arguments for both sides and your individual preferences and experiences determine which one you will find more convincing. There is no binary right or wrong over all cases.


I can't tell you which side is wrong, but I can tell you which side is trying to oppress certain groups of people just because they're foreign.


If you are referring to the immigration example, it's not that easy. By digging into this position you won't really be able to reach people on the other side who, on some level at least, do have some justifiable grievances with some aspects of immigration. I am staunchly pro-immigration, but I realize it's a big challenge for some people, and to move forward we should realize that simple heuristics are not going to be the easiest way to the place where we all want to be.


I am from Myanmar. I have many doctor friends and friends with degrees from the US who have this kind of bias or hatred and trust all the fake Facebook information they want to believe. It is not about education. It is the same in the US. People will believe what they want to believe.


> It's countries filled with uneducated people who don't understand how to evaluate information that are vulnerable to this kind of propaganda.

I see plenty of similar sentiments in my facebook feed.


> In any war, both sides believe they're right, but they can't both be right

If you assume that there is an objective, external moral "right" defined in a very particular kind of way, this may be true, but that hidden premise is far from universally accepted.


You'd have to solve the language barrier first. Burmese speakers will not be able to read material written in the Rohingya language.


If you lack access to critical facts, then critical thinking can only do so much. Trust may be the best thing you have.


> After you've done that, you can allow free speech

Who are you to "allow" or "disallow" free speech?

"Disallowing" free speech is actually a human rights violation, isn't it? So what you're doing here is providing weasel moral grounds for human rights abuse.


Do you mind if I e-mail you some writings? I'd like to get your opinion on them. I agree with what you say, and I'm brainstorming solutions to the lack of education and critical thinking / bs filtering [first at home, eventually abroad]



As someone living in Yangon, I can say that this is a very common bias people have including highly educated Burmese. It is unfortunate.


I have yet to meet a Muslim man with more than one wife, let alone four. Such a lack of understanding of true Islamic principles and provisions (and the use of them).


Islamic laws explicitly allow multiple wives and I know people who have multiple wives. Here in India the top body of Muslims (AIMPLB) is now fighting a revision to the law that delegitimizes it (the right to have four wives).

The strangest thing is that there's a sizable number of women who support the law in its current form. Religion is difficult to comprehend.


Religion... or economics. Marriages traditionally were more about economics (and in case of powerful families, politics) than emotions. The concept of marriage being almost entirely a consideration of the people getting married seems relatively modern and western.


Also war logistics


> The strangest thing is that there's a sizable number of women who support the law in its current form.

Different localities have different cultures, and for some, it's good. Say for example, in my locality (Kerala), Muslim men take more than one wife only under certain conditions, like:

1. The first wife can't bear a child, and the man marries another without abandoning the earlier wife

2. Marrying widows so as to protect them (there might be occasional cases of exploiting this too, though I'm not aware of any).

I have also seen a handful of cases where a man marries another woman to threaten his other wives. But so far, I remember only two such cases in my neighborhood, and none in my family. Usually such men get very little support from the community.

So women with such experiences would support the current law.


> Different localities have different cultures, and for some, it's good.

Men deciding what women want is a repugnant thing to do. To use your own examples:

1. If the infertility is with the man, can the woman simultaneously keep two husbands?

2. Can a woman marry a widower to take care of him while being in a marriage?

Pretty sure this is blasphemous in Islam.


When you comment about culture, be sure you know something about it.

It's quite hard for a woman to get married again once divorced (in pretty much every village in India, at least). But for males that isn't usually the case.

And in Islam, it is the man's duty to protect and feed his women, not the other way around. I think this is true for all Abrahamic religions.

In Islam, this is true even when the woman is rich. That is, the ownership of the woman's property doesn't change before or after marriage. The man has to provide food and shelter to his wife and children regardless of the woman's wealth; the woman is free from such duties. There are also other issues like inheritance (determining the true father of a child) and treating the spouses equally (to the extent that a person with two wives should spend every alternate day with one wife and the next with the other; this isn't possible when a woman marries two men, given pregnancy periods, etc.).

And to answer your question: No. Islam doesn't allow that.


We're moving to a world where women will be equal partners in the workforce, even in India. We should take steps to achieve gender equality faster, and religion steeped in centuries old customs isn't helping.

Women's rights are an absolute, irrespective of whichever religion people belong to. If that means striking off what's regressive from the holy books, that's what liberal believers need to do. The sooner we get to a Uniform Civil Code, the better.


Well, aren't you imposing your liberal view on others by imposing a uniform civil code, which will possibly be modeled after the majority religion in the country (India)?


I think your interlocutor is just trying to show why some Muslim women support polygamy. And there certainly are such women. We can't just dismiss an insight into why that is the case because you dislike the situation, no?


So, to sum it up: it is misogynistic.

(Same as most Abrahamic religions, since you mentioned those.)

And no, "it's our culture" is not an excuse for that.


>>Here in India the top body of Muslims (AIMPLB) is now fighting a revision to the law that delegitimizes it (the right to have four wives).

Polygamy is common in all major religions in India. Also, it can be argued that in most Indian cases having a mistress is polygamy too.

You will also be a little surprised to know that Muslims are the least polygamous community in India.

https://scroll.in/article/669083/muslim-women-and-the-surpri...

>>The strangest thing is that there's a sizable number of women who support the law in its current form.

Polygamy has/had its uses across history. Also if you don't mass murder girls like they do in India, you inevitably come up with use cases for things like these.

>>Religion is difficult to comprehend.

It isn't. Religion is social evolution. That a few things don't apply to one person doesn't mean they don't apply to billions of people.


That article is disingenuous in taking the 1961 survey: the Hindu Marriage Act, which made polygamy illegal for Hindus, had been enacted only five years earlier, in 1955. Figures taken today would be vastly different.


Legality of anything is hardly a problem in India.

For that matter other religions register under special marriage act in which Polygamy is forbidden too.


There is no State Legislation in India governing Muslim Marriage. According to Muslim Personal Law (Shariat) Application Act 1937, the subject of Marriage is included in the List of subjects (Section 2 of the Act) on which Courts will apply only Muslim Personal Law, as a rule for decision where both the parties are Muslim. Muslim Personal Law permits multiple marriages for the Male and it is not illegal for a Muslim Male to marry a second time during the subsistence of his first marriage.

https://en.wikipedia.org/wiki/Muslim_Personal_Law_in_India https://indiankanoon.org/doc/1325952/


> Islamic laws explicitly allow multiple wives and I know people who have multiple wives.

Sure, but how common is it really? Are you talking 3% of Muslim men (in which case it's peculiar to regard it as a particularly Muslim thing)? Or 93%?

And what, specifically do we object to about it? I'm against polygamy only in that (at least in situations I know more about such as the Fundamentalist Latter Day Saints movement, or FLDS) it seems very heavily entwined with abuse of power and/or underage marriage, and cults. If polygamy was not coupled with those things then I would struggle to identify on what basis (if any) it is objectionable.


The number would be less than 3% because it'd be very difficult to actually pull it off. But the point is that a sizable number of people (mostly men, but women too) believe it should be legal, even if they don't do it themselves.

I completely agree that polygamy by itself isn't objectionable. What's objectionable is allowing it only for men.


What is this based on? Are you Muslim or do you have direct experience with the practice of Islam in India?

We just read about false information being spread about religions, and about Islam in particular. Let's be careful to back up what we say.


A few years ago, I was tutoring a high school student. Her father had two wives (at least, but I think two is more likely).


you are lucky enough to live in a country where polygamy is outlawed.


[flagged]


To be fair, it was normal at that point in history at many parts of the world to have multiple wives (or one wife + multiple concubines).

That and the prophet was actually monogamous for 25 years (until his first wife who was 15 years older than him passed away).


Every religious scripture has many things that are barbaric by modern standards. This comment really is out of place.


[flagged]


Religious flamewar is not allowed on Hacker News. Please take/keep this elsewhere.

https://news.ycombinator.com/newsguidelines.html


Nobody is "flaming", just pointing out facts. I didn't start the argument though, so I suggest you clean up from the root.

It's amazing how biased you are in this.


Your comment was obviously religious flamewar in the sense that we ban accounts for. We try hard to be unbiased, but when someone has just broken the site rules egregiously, such claims are a little hollow.

You unfortunately have a history of violating the HN guidelines in comments. We've asked you several times to stop doing this. If you can't or won't stop, we're going to ban you. Please (re-)read https://news.ycombinator.com/newsguidelines.html and use this site as intended.


Bahai would probably be an exception. About the worst they do is be slightly sexist.


I believe the reason for this was to protect the women. Not sure though.


This comment is completely irrelevant to the post. It seems to emphasize and sensationalize the underlying bias leading to possibly fake news on facebook.

My point is - HN is largely about tech. Not about bias against or for a community. So STOP making it about communities and religion!


I don't think it's fair to think that tech companies are changing the world, but then try to put the changes themselves off-limits for discussion.


Tech companies are changing the world by building tools.

What people use these tools for should be discussed in the context of the people. Discrimination didn't start because of Facebook and won't stop if/after it leaves.

This is a problem of moral education, and I personally don't want Facebook to be the moral compass of the world. This is something the government and society of Myanmar need to work against and figure out in sovereignty.

Are we suddenly implying someone in a first-world rich western country should interfere with the politics of an independent southeast asian nation again? Please.


> someone in a first-world rich western country should interfere with the politics of an independent southeast asian nation again

They already have. By deploying Facebook. And especially Facebook's decision to subsidise internet data use for customer acquisition.


'I have to thank Facebook because it is giving me the true information in Myanmar.'

What do you need, a flashing neon sign reading "tech" to be interested? He's directly pointing to one of the 5 big tech firms; it could not be more relevant. Given that, what's your problem?

You can argue a lot of things about this, but “not related to tech” isn’t one of them.


My problem is that he's focusing on the bias against a community, and not on Facebook's unique role in this. We don't have to turn HN into another activist platform for social justice, the internet is filled with such.

Facebook seems to be the most viral/effective platform for spreading any (fake or true) news, and in this instance, the news was possibly harmful to a community. If not facebook, the perps would have relied on the next most viral platform.

That still doesn't explain his cherry picking of comments that speak to bias more than tech. So, no - his comment is NOT relevant to tech!!


The effect of tech matters. It always has. We spent much of the middle of the 20th century working out the downsides of chemical technology, and the ever present risks from nuclear physics; digital tech can have its toxic waste spills and weaponisation like everything else.


Every time this even becomes a question, I also feel compelled to point to the same things.

Including one of my more important figures in those questions surrounding chemical technology and nuclear physics, paraphrasing an old Buddhist proverb:

---

“To every man is given the key to the gates of heaven. The same key opens the gates of hell.

And so it is with science.”

Richard Feynman

---

I'm continually baffled, though less surprised lately, how many people still think that they can shirk all responsibility. It's important to take time to consider what effects your work can have on the world, and not think anything and everything justified in the name of making money. At some point, we're all in it together.


Responsibility is hard, and most people are machines designed to find paths of least resistance.


You know how it goes, “It is difficult to get a man to understand something, when his salary depends on his not understanding it.”

It’s all over this place, regrettably.


You're completely wrong. Facebook's stock has crashed in the past week - perception very much matters. While virality is facebook's USP, there's no evidence that an alternative viral tech option would not have resulted in a similar outcome.

It's convenient to blame Facebook and ignore the government's actions. And it's convenient to ignore the fact that Facebook is infrastructure, like roads and electricity. When bad people use roads and electricity, you do not cut it off for the rest of society, or blame the roads for bad things.


Facebook isn't infrastructure like roads and electricity because it's not neutral. I don't hate FB, but it's a privately owned platform. Real infrastructure uses protocols, not platforms.


> When bad people use roads and electricity, you do not cut it off for the rest of society, or blame the roads for bad things

No – instead, the government passes regulation to minimize the number of people that die on the roads. Your car needs to meet safety standards to drive on the road, it needs to be registered with the government, and you need to be licensed to drive it.

Things that become interwoven in the fabric of society don't automatically just get a free pass. I would say the opposite happens, actually.


Agreed, this is about tech, but then the other parts about hate speech or misinformation against a community shouldn't be posted here, as they may leave a false impression.


https://docs.google.com/document/u/1/d/e/2PACX-1vQgMzRBc6P2m...

As someone who spent a lot of time in Myanmar in the past 6 years, I saw the country go from near zero internet to 50-60% penetration in a couple of years. The strange thing is that most people don't really use the web - it's all Facebook. The internet, for a vast majority, means Facebook. The way news spreads in Myanmar is gossip and rumours - probably because of very limited press freedom in the past. This combined with some ugly undercurrents of nationalism, extreme poverty (for all ethnicities) in the conflict areas, and decades of poor education, has sometimes turned Facebook into an amplified medium of hate speech.

I don't think Facebook is to be blamed for the violence (it has existed, and still does, in many parts of Myanmar against other ethnic minorities - Karen, Kachin, etc - before Facebook, and without the same amount of press given to Rohingya) but it has most likely amplified the hate speech.


It's the same dynamics as elsewhere in the world, just more clearly visible due to extreme effects. Social media brings the fringe that had no platform before and unites them to the extent they are able to effect change. From policy change to outright genocide.

Traditional journalism/media always gets a ton of criticism, but for all its flaws it at least strived to have basic standards (fact checking, neutral language) and made its best attempt at objectivity. With the media circumvented this decade, the crazy grandpas of the world united.


This comment from another subthread is relevant to your observations:

https://news.ycombinator.com/item?id=16743502

Partial quote for your convenience: (read the full comment for sources and more).

> FB stopped being a "data sharing platform" when it muscled its way into becoming synonymous with "the internet" in much of the underdeveloped world. In many of these countries (including Myanmar) people can get a really cheap plan that offers next to nothing beyond "Facebook Free Basic", FB's walled garden (e.g. for $3/month you get 100 MB of data + unlimited Facebook). This is effectively a two-tier Internet with FB being the obvious default.


I think a big problem is that modern communication platforms provide the power to assemble a public anytime, anonymously, and as a society we are yet to realise that. In a real-world gathering, you have media and government observers, and anybody making incorrect statements is most of the time scrutinized and exposed.

However, now that anyone can publish information fast and to a large number of people, false facts will circulate fast on any communication platform.

Here in India, in my village, people forward each other long false statements and rumours, and people believe them blindly and propagate them to their contacts. As a society, I don't think we know how to handle this.


> In a real-world gathering, you have media and government observers, and anybody making incorrect statements is most of the time scrutinized and exposed.

Yeah, we've never had genocidal movements whose propaganda was spread, among other means, by in-person rallies or other such assemblies, precisely because “real world” gatherings have the kind of real time fact checking you discuss.


I am not saying that at all. I am just saying the online gatherings we have now are unprecedented both in their size and in their numbers.

Imagine an in-person rally of 10k people. It would be a huge event and would get huge coverage. Bigger rallies would get even more. So some local media or local government has a chance to counter the falsehoods and make people aware. Governments may even choose not to allow some gatherings in some parts of the world if they fear they will cause a significant law-and-order issue.

Also, offline gatherings depend on people propagating the issue. This would previously have required people brainwashed or dedicated to the cause. Now even normal people can forward anything to all their WhatsApp contacts and boom, the chain goes on.


> Imagine an in-person rally of 10k people. It would be a huge event and would get huge coverage. Bigger rallies would get even more. So some local media or local government has a chance to counter the falsehoods and make people aware.

In Myanmar, as in most other cases, the government was an active belligerent; they wouldn't have been a check in any case. Genocides are usually conducted by groups that are already dominant in government, media, and society in general, so the kind of social institutions you point to are usually subverted before rallies (physical or virtual) directed imminently at violence occur.

So, it's quite unlikely that those institutions will check in-person rallies; more likely they will be organizing them and spreading their message.


Indeed. One of the most horrific modern genocides, the Rwandan genocide, was organised via radio broadcasts and the state's police and military.


Okay. But even with the government as part of the gathering, the premise remains the same. Even for the government it is vastly easier to use the new communication platforms to spread hate and solidify its grip on power. I am not saying the government couldn't use posters and banners and newspaper ads, I am just saying the new tools make it vastly easier and give the government a "we didn't know about it" shield.


You're correct that there's a similar process in real life as online. The issues are that (a) these effects are more automated and distributed online than in real life (Toyama nicely summarizes this as "technology as amplifier" [1]) and that (b) the methods of automation and distribution are (by design and intent) less visible online than in real life.

People have little ability to change A, but could exert influence on companies to combat B if sufficient pressure were exerted...

[1] https://www.theatlantic.com/education/archive/2015/06/why-te...


When it’s Tunisians and Egyptians using FB and Twitter, everyone is quick to take credit and talk about the “transformative” power of social media. When it’s genocide, suddenly everyone is mumbling and looking at the ground. You can’t have every job listing include “changing the world” and then duck responsibility when you actually do change the world, just for the worse.

Hypocrisy is an untenable position. Watching people argue that social media is equivalent to rallies, on HN would be funny if it weren’t so desperately sad. You can make this comment disappear, but you know I’m right about this. Even a brief glance through comment histories and submission histories bears it out.


We've made mass communication wildly more efficient without education or skepticism making the same leap, so it becomes possible to share a twisted and false version of reality to suit one's needs. Or hatred. My time on the Internet has been one of constantly remembering this[0] image from the old comic Nexus, where he is forced to modify the repository of truth in order to stop violence. I'm guessing something similar will be required here, and then we'll need to come to grips with the fact that "reality" is a pretty local concept, or perhaps a concept whose diameter is proportional to one's education/luck.

[0] https://imgur.com/a/3dhxt


It still astounds me that many people I know are unable to decouple the source of the information from 'Google'. It usually goes like this: someone I know will state some odd claim about some natural treatment for something; I will ask for a source and invariably get the answer 'Google'. I will then try to use the radio vs. radio station analogy and ask for the source again, only to be told that they don't remember where the claim came from. Facebook seems to be the same: I will often be told about something someone 'read on Facebook', but they are often unable to say who posted the information.


> media and government observers, and anybody making incorrect statements is most of the time scrutinized and exposed.

This is not true. When groups of people get together in public, they're free to state falsehoods. Big Brother-style government oversight is not the answer to the "problem" of that freedom.


They are free to state falsehoods. However, gatherings of even a moderate size get covered in local newspapers, which analyze them and publish their own perspectives. Generally, the larger the gathering, the higher the scrutiny.

However, the new platforms allow everyone to create very large gatherings anytime. Both the size and the number of these gatherings are very large, and so are their scopes. It becomes impossible for people to keep track of the content discussed in them, and people then go by whatever they feel good believing. On a country level, this can cause huge changes, both good and bad.


Generally, racist/genocidal falsehoods are used to attack a vulnerable minority.

The interests of said minority are typically un-represented in local newspapers. The papers themselves are often cheer-leading this sort of behaviour.


On the contrary, historical tweets are often systematically picked apart.


It's incredible how fast the sentiment has changed regarding Facebook. Back in 2010/2011, Facebook was being praised for fueling the Arab Spring, which toppled dictators and spread democracy (and also led to the horrific war in Syria).

Today, stories bounce from fake news, to Russian propaganda, to user data mishandling, to fueling genocide in Myanmar.


> Back in 2010/2011, Facebook was being praised for fueling the Arab Spring, which toppled dictators and spread democracy (and also led to the horrific war in Syria).

Correct me if I'm wrong, but I thought that was Twitter as the major player RE: Arab Spring.

I think 8 years is a more than long enough time for backend changes in a platform to change sentiment. Also, the platform may not have been used for as many nefarious purposes in the past.


I visited Facebook during this period, and the whole company was celebrating their role in the Arab Spring, which was large as well.

The rhetoric around this has shifted, not because Facebook has done anything different, or because subsequent uprisings have been any different.

But those mini revolutions of the Arab Spring were not what we thought they were. "The People", taking power from the dictators, created the society they wanted, but it is LESS in line with liberal values than the previous one. This is what we failed to realize at the time.

Facebook has done an amazing job empowering the will of the people. Which we are still inclined to view as positive.

But when the majority of people WANT a government that is less tolerant of other religions than a dictator would be, what should we do? Should we blame Facebook? Or should we try to use Facebook to counter the hateful ideas that are being spread?

How trivial would it be to make sure this leader in the article who 'never met a Muslim', was forced to scroll through a feed with positive messages about Muslims and religious tolerance?

If Gandhi or Martin Luther King Jr. were alive today, could they sway our elections with a billion-dollar ad budget? We can't force any other group to live by our values. But we can promote what our values and ideas are, in a way that might help them spread.

Maybe we should ask Facebook to help with this, instead of trying to take down the whole network just because some groups are using the platform in a way that we don't like.


> But those mini revolutions of the Arab Spring were not what we thought they were. "The People", taking power from the dictators, created the society they wanted, but it is LESS in line with liberal values than the previous one. This is what we failed to realize at the time.

This is tragically true. A revolution is a re-spin of the roulette wheel when it comes to outcomes; the result is absolutely not guaranteed. In the case of Egypt, the semi-secular military dictatorship was overthrown and became... an Islamic theocracy, which was actually worse for the average woman or non-devout Muslim, and the ensuing counter-revolution back to military dictatorship was reluctantly acquiesced to.


Revolutions have a really bad track record. For example http://journals.sagepub.com/doi/abs/10.1177/1043463970090030...


>How trivial would it be to make sure this leader in the article who 'never met a Muslim', was forced to scroll through a feed with positive messages about Muslims and religious tolerance?

So when the users don't conform to the world view of the platform operators, the answer is platform sponsored propaganda? That is incredibly dystopian.


> So when the users don't conform to the world view of the platform operators, the answer is platform sponsored propaganda? That is incredibly dystopian.

Advertising is propaganda dedicated to getting you to conform to the advertiser's preferences, and it's already targeted however the advertiser prefers, which could be by lack of conformance to those preferences.

You are basically suggesting that it is dystopian for the platform operator to do itself what it already builds its business around allowing anyone with a checkbook to do.

I would suggest it is no more dystopian than the status quo.


> So when the users don't conform to the world view of the platform operators, the answer is platform sponsored propaganda?

Facebook sponsors propaganda all the time. Its algorithms favor certain posts and messages over others, based on a long list of attributes, including ad dollars. But apparently that's OK, while suggesting that Facebook display some positive third-party post about Muslims in Myanmar - while a genocide is underway - is suddenly "platform-sponsored propaganda"?


I don't support any of the propaganda that Facebook pushes. This also doesn't have anything to do with ads. The issue is that when these mega-platforms decide that they are going to promote one side of a political issue and suppress another (as they have been doing more and more), this comes at the cost of freedom of expression, and all you're left with is corporate-approved discourse.

Of course this is much easier to rationalize in this situation. However that's how you end up giving away liberty. You give it up in little pieces in response to extreme situations, then when you go back about your life you don't get it back.

The real issue here is that this gets the whole problem backwards. We shouldn't be looking at a state-sponsored genocide and then claiming that "if only Facebook had more control over public discourse then we'd be able to solve this problem". The "problem" in that statement can be anything from this genocide taking place, to your preferred candidate losing an election. "We need more propaganda" isn't going to solve any of that, and in reality it's just a veiled power grab by companies that wish to control public discourse more effectively.


> We shouldn't be looking at a state-sponsored genocide and then claiming that "if only Facebook had more control over public discourse then we'd be able to solve this problem".

The genocide in Myanmar has both state-sponsored and spontaneous characteristics. Facebook can and should help with the latter, if it's going to allow folks to pass around pro-genocide messaging using the site.

In the end, I advocate for everyone to abandon all large-scale, centralized, corporate social media, but given how unrealistic this goal is at present, my next hope is for the large social media companies to assume more responsibility for their actions. The same standards that we've traditionally held all media companies to.


But why is propaganda your preferred option over traditional content moderation? I doubt many people would consider it controversial to censor calls to violence. However, when you start to focus on "hate speech" (which is where this line of reasoning will take you) and add propaganda to your moderation toolkit, you start to cross the line into full-blown dystopia territory.

Again, this may be a more compelling example of how you might define "hate speech". But if you think Facebook has a responsibility to start moderating speech based on its perceived "hatefulness", then you're still going to end up at the same destination, where genuine public discourse is gradually replaced with corporate-approved discourse.


You think 'content moderation' is less dystopian than 'propaganda'...

I think you are hearing something I didn't say.

I'm just talking about promoting some TED Talks. Making sure everyone can name at least 1 good Muslim. Maybe trying to raise the profile of the people who should be winning the Nobel Peace prize.

If someone is IN a hate group, I would like them to have some limited exposure to the views of the other side. I don't think that's propaganda.

I think 'content moderation' has a much higher potential for abuse, and a nightmare future.


...except it didn't really spread democracy, at least not in a particularly long lasting way. Maybe what we've learned is that fueling populist uprisings isn't something that is particularly good, and that seems to be what Facebook is good for, politically.


It did make Tunisia a bit more democratic since Ben Ali left, which is where everything started. But Egypt, where all the US reporters rushed to witness the "revolution", is back to a US-backed dictatorship; Syria is in shambles; Libya is in chaos; nothing changed in Morocco or Algeria. And curiously, nobody on Facebook or Twitter was talking about the violent revolts inside the Gulf monarchies that were harshly repressed by their respective governments.


The way you describe it, it's as though American social media companies were used as tools to further the geopolitical objectives of the United States and its allies.


I had to look up populism to make sure:

>A political philosophy supporting the rights and power of the people in their struggle against the privileged elite.

The idea that the government is going to go after Facebook because it is used to fuel the rights and power of the people in their struggle against the privileged elite is very scary.


You need to read the rest of the Wikipedia article you quoted without attribution, and recognize that a frequent use of “populism” without qualifiers is to refer more specifically to authoritarian populism, particularly emergent authoritarian populism built around a mythologized virtuous people opposed by an equally mythologized evil “elite” (which often is not an actual elite, just a recognizable group around whom a conspiracy theory of behind-the-scenes power has been constructed).


That's funny, I use DDG and when I searched "define populism" that's the result that came up from DDG itself, which they say came from Wordnik.

https://www.wordnik.com/words/populism

I disagree with your sense that unqualified "populism" implies authoritarian populism. In my experience it's used without qualifiers to describe both Trump and Bernie. I suppose whether you think those two are authoritarian probably depends a lot on your own political leanings. But generally it means "fight the power", or in Trump's case, "drain the swamp" [0]. Here's a NYT article that, having skimmed it, seems reasonable to me.

https://www.nytimes.com/2016/03/27/magazine/how-can-donald-t...

[0]: This comment is not intended to confer any positive feelings towards Trump or imply that he did anything other than make the swamp even swampier. It was only in reference to his rhetoric


> I suppose whether you think those two are authoritarian probably depends a lot on your own political leanings.

I think both the Trump and Bernie movements have some frightening populist characteristics. But between the two leaders, only Trump has been broadcasting blatant authoritarian messaging.

That's not to say that Bernie would never engage in any authoritarian action if he were elected, but he's not made overt any authoritarian tendencies like Trump has.


Perhaps a simpler definition of populism is acquiring power outside of traditional state institutions. What makes this a bit frightening is that those institutions then cannot regulate the worst excesses of that leader.


How is that different from a democracy that was founded in opposition to a monarch or dictator, like the USA?

From the wiki on Populism:

>Populism is most common in democratic nations and political scientist Cas Mudde wrote: "Many observers have noted that populism is inherent to representative democracy; after all, do populists not juxtapose 'the pure people' against 'the corrupt elite'?".[6]


The populist rhetorical device of directing mass public hatred against a particular target is used at least as easily by dictators - aspirant or actual - as against them, particularly when it relies on open communication to flourish.

Sure, populist upheaval can lead to American Revolutions. But it can just as easily lead to pogroms, and one of those patterns is repeated far more often throughout history than the other.


[flagged]


We've already asked you to stop using HN for ideological battle. It's not what this site is for! Would you please read https://news.ycombinator.com/newsguidelines.html and use HN as intended from now on?


Empathy and self-control have been the internal enforcers of the civilizing process, a process with which many peoples across the world are still struggling, and the nature of Facebook is such that those internal enforcers are broken down with such swiftness that our current situation is inevitable.

The mantra of "move fast and break things" has driven us off a cliff, where our moral backbone is what has been broken.


Wasn't that Twitter to the point the State Department contacted them about moving a planned service outage?


I think a lot of that praise in American media was an extension of neoconservatism that permeated the administrations around that time


From the article:

"He recalled one incident where Facebook detected that people were trying to spread “sensational messages” through Facebook Messenger to incite violence on both sides of the conflict. He acknowledged that in such instances, it’s clear that people are using Facebook “to incite real-world harm.” But in this case, at least, the messages were detected and stopped from going through."

Notable: "through Facebook Messenger"

So one extrapolates that FB actively monitors private conversations carried on Messenger.


Probably they are monitoring for spam. As it's not end-to-end encrypted, they certainly have access to the messages.


Anecdotally, I often receive ads targeting topics only mentioned in Messenger conversations.


That's how they suggest stickers and responses too, I'd assume.


> "there’s about to be an uprising of the Buddhists, so make sure that you are armed and go to this place"

> Facebook's systems detected what was going on and stopped the messages from going through

Wwwwhaaat?! Some people may have just seen that message and interpreted it as "shit hit the fan, let's hide my family in a safe place until this cools down", even if it was intended as a "call to violence". Censoring messages like those could just as well have caused deaths, because innocent people just didn't get the heads-up.

Corporations should clearly define themselves either as "medium companies", staying completely neutral to whatever flows through their platform as long as it's not "explicit content" (yes, this includes allowing "hate speech" as long as it's toned down, because that "hate speech" can also contain useful information, and it's not something clearly identifiable), or as "message companies", in which case they can clearly take sides in conflicts, but also be responsible (legally) for their actions.

This muddy "middle ground" position that some companies take is "the root of all evil". Either let anything happen (including bad things), or pick a side, so that you can later be judged according to the side you picked. It's condescending to imagine that you're actually smart enough to "properly filter" information. You're not, or you're a tyrant imposing his value system on others.

I have more sympathy for a corporation that does evil deeds in the service of profit than for one that interferes in "muddy" ways in social issues and prevents clarity and the free flow of information. Sometimes this flow of information causes blood to be spilled, but sometimes problems get solved this way, if a society is not evolved enough to solve them in more peaceful ways. Toning down discussions and letting tensions accumulate is worse.


I once reported a group created solely to harass and defame my FB friend. The creator of the group clearly violated multiple Facebook policies: 3.3 "You will not bully, intimidate, or harass any user"; 3.9 "You will not use Facebook to do anything unlawful, misleading, malicious, or discriminatory"; and 5.1 "You will not post content or take any action on Facebook that infringes or violates someone else's rights or otherwise violates the law".

Facebook reaction? They found it doesn’t violate their community standards: http://const.me/tmp/fb-policy.jpg

Right, “no place for hate speech”.


I once reported a group named "Assassinate Donald Trump" and got a similar "doesn't violate their community standards" generic message.

I wonder what is the threshold for reports before an actual human looks at them, or what keywords will get Facebook's attention. Perhaps a bunch of people need to report simultaneously.


Should have told them the community had conservatives posting in it. It would be banned in record time. I’m only partly joking.


This happened to my “snowflakes for trump” group


There are plenty of things that Facebook can definitely get rid of seemingly every time: spam (advertising you've not paid for), female nipples (breastfeeding controversy passim), competing social networks ( https://www.wired.com/2015/11/facebook-banning-tsu-rival-soc... ) and so on.

Hate speech is just not a priority in the way spam is.


Getting rid of nipples is easy with a deep neural net pornography image recognizer. I think the guys at Facebook don't want to deal with the exceptions, so they just don't, and say "oops, community guidelines".


I'm not sure I believe that, but if so, why is a neural network hate speech recogniser so much harder, then?


Because hate speech involves understanding intent. In the case of breasts/nipples, they can have a blanket rule against it. Thus even an artistic image with breasts would be removed (not just pornography).

If they did that with speech, then they would censor an insane amount of sarcastic speech as well. The false positive rate would be really high and that would be a chilling effect on speech. Similarly, the actual hate speech would find ways to use new coded terms and nuance to get around the filters.

Imagine if instead of "no female nipples" they had a policy of no overly sexual images. That would be way more subjective and harder to enforce. My bet is that is the case with text based speech.


Unfortunately for this hypothesis the current facebook nudity policy is full of intent checks and subjective criteria: https://www.facebook.com/communitystandards#nudity

"We also restrict some images of female breasts if they include the nipple, but our intent is to allow images that are shared for medical or health purposes. We also allow photos of women actively engaged in breastfeeding or showing breasts with post-mastectomy scarring. We also allow photographs of paintings, sculptures, and other art that depicts nude figures. Restrictions on the display of sexual activity also apply to digitally created content unless the content is posted for educational, humorous, or satirical purposes"


These are mostly image classification cases, though, aren’t they? not to mention these rules are partially enforced through human moderation, same as text.


Because properly understanding speech requires sophisticated cognition. No one has any clue whatsoever how to solve that. And political speech is one of the more difficult domains in NLP due to prevalence of innuendo, sarcasm, and non-sequiturs.
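To make that concrete, here's a minimal toy sketch in Python (the blocklist and example sentences are entirely hypothetical, and this has nothing to do with Facebook's actual systems) of why keyword matching can't capture intent: it flags counter-speech that merely quotes a slur, and misses coded incitement that avoids listed words.

  # Toy illustration only: a naive keyword-based "hate speech" filter.
  # The blocklist and examples are made up.
  BLOCKLIST = {"vermin", "exterminate"}

  def naive_filter(text):
      words = {w.strip(".,!?'\"").lower() for w in text.split()}
      return bool(words & BLOCKLIST)

  examples = [
      # direct incitement: caught, but only because it uses a listed word
      "They are vermin and should be driven out.",
      # counter-speech quoting the slur: false positive
      "Calling your neighbours 'vermin' is exactly how genocides start.",
      # coded incitement with no listed words: false negative
      "You know what needs to be done about those people. Be ready tonight.",
  ]

  for text in examples:
      print(naive_filter(text), "-", text)

Real systems are obviously far more sophisticated than this, but the failure modes are the same ones described above (sarcasm, quotation, coded language), just harder to trigger.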


Facebook made it deliberately difficult and obfuscated to report violations as well. Twitter by comparison makes it relatively frictionless. I don't use Facebook except to follow some news sources but I now follow them on Twitter and barely use Facebook. I know one is supposed to assume incompetence where malice is seen but gosh, I can't decide where Facebook falls yet.


I think the point the parent is making is that standards are being applied unevenly.

On Twitter, for example, there are a host of accounts that have been banned for seemingly harmless activity while other accounts with much more serious issues remain active.

Despite it being easier to report violations on Twitter (taking your word for it...I have no personal experience), standards aren't being applied fairly. That undermines the whole point of standards.


Facebook is notorious from both ends of this, it seems. I've had a Facebook marketplace post taken down with no explanation other than a copy of the guidelines, I appealed, and now my marketplace posts simply don't get seen by other people, even if they search for them. I've had a friend's account get suspended for a meme that pissed some people off, while a legit neo-Nazi group that had Swastikas on the page I had reported months earlier for making comments about the extinction of the pure white race was still up. It seems they only police enough to claim they do. They don't actually do much that anyone is happy with.


You'd think something like a simple downvote button that lowers the probability of the post being seen would kinda work.
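As a rough sketch of the idea (the formula and weights below are invented purely for illustration, not anything any platform actually uses), downvotes could simply drag a post's ranking score down so that it surfaces less often:

  import math

  # Toy ranking: downvotes reduce how likely a post is to be surfaced.
  # The formula and weights are invented for illustration only.
  def visibility_score(upvotes, downvotes, age_hours):
      engagement = upvotes - 2 * downvotes      # weight downvotes more heavily
      decay = math.exp(-age_hours / 24.0)       # older posts fade regardless
      return max(engagement, 0) * decay

  posts = [
      {"title": "local news story", "up": 120, "down": 3, "age": 5},
      {"title": "inflammatory rumour", "up": 200, "down": 180, "age": 5},
  ]

  ranked = sorted(posts, key=lambda p: visibility_score(p["up"], p["down"], p["age"]), reverse=True)
  for p in ranked:
      print(round(visibility_score(p["up"], p["down"], p["age"]), 1), p["title"])

The obvious catch is that any crowd-driven signal like this is trivially easy to brigade.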


I would think that's ripe for abuse. For curating my personal feed, I've been fairly content to unfollow people I want to remain friends with and hide posts shared from specific accounts. I don't actually see neo-Nazi posts routinely - but they're clearly less banned from using the platform to communicate than my attempts to sell used sports equipment are. And I honestly wouldn't care if Facebook left their page in peace in the name of free speech and diversity of opinions, but then at least have the decency to tell me why I can't post on the marketplace. Like I said - it's the claim that they police content - it only inconveniences and confuses people instead of actually getting rid of bad content.


“It is difficult to get a man to understand something, when his salary depends on his not understanding it.”

https://quoteinvestigator.com/2017/11/30/salary/


I just checked and they seem to have made it easier. Click the "..." at the top right of the post, click 'Give Feedback', click 'hate speech' or whatever.


I checked the pic, is your friend the Russian political blogger / Putin critic with virtually the same name who writes about conspiracies? If not, maybe just a case of mistaken identity.

If it was him, maybe he fell under some clause of notable persons. I don't know what the content was like of course, but that could be one reason they didn't immediately delete.

Did they eventually remove or did the creator shut it down?


> is your friend the Russian political blogger / Putin critic with virtually the same name

Yes, he is.

> who writes about conspiracies

No, he doesn’t. He’s an investment banker, and he mostly blogs about the consequences of Putin’s external politics (i.e. the ongoing wars in Ukraine and Syria) for Russian economy.

> Did they eventually remove or did the creator shut it down?

I don’t know what happened to that group.


Humans make the decisions, and humans make mistakes. Humans don't always "get" all the context for every issue. Do you really expect perfection for every judgment? There are anecdotes for every scenario, and they don't accurately portray the general case or the intent.


At that time, too many of Facebook's mistakes were strongly in favor of the Russian government.

https://www.theverge.com/2014/12/22/7433277/facebook-blocks-...

https://www.theguardian.com/world/2015/jun/17/facebook-pro-k...


These aren't isolated incidents. I have yet to report a post that they found to cross the line. And I only report extreme cases, maybe two or three posts per year.


Unless there's a woman's exposed nipple. Those get taken down on time, every time.


You are still an anecdote. You know this, and you also know that people get posts taken down every day, by the thousands of people working to moderate content.


We are legion. You're the first person I've met who claims that FB's moderation works well, modulo the inevitable mistakes that people make.

Germany even passed an (admittedly very bad) law, mostly because of FB's unwillingness to take this problem seriously.


I said it wasn't perfect, and that it can't be perfect. I shouldn't be surprised that avoiding hot takes and trying to be reasonable is rewarded with downvotes on HN.


Ok, so then what? We should tolerate a certain number of bullying groups and casualties for the greater good?


We should fight against, but within the framework of free speech.

It's just as easy to suppress malevolent ideas with censorship as progressive ones. Handing that power to an elite few, however noble their intentions at first, is dangerous.

In this case though, it does appear they made a mistake.


What's the alternative? Can you go on Twitter and find bullying? Can you go on Reddit and find bullying? Is there any internet property where moderation is 100% perfect?


Examples that are equally bad should not set the bar, right? Nothing gets better by doing that.


Is there a better example? Is the concept of perfect moderation even theoretically possible?


> Is there a better example?

No one is perfect, but I think Reddit is better.

At least they are transparent, i.e. unlike Facebook, Snapchat or Twitter they aren’t afraid to talk to journalists about their content filtering: https://www.newyorker.com/magazine/2018/03/19/reddit-and-the...


We should fight bullying as a society, not attempt to hide it.


I have never seen any other reaction, the few times I've actually reported posts.

They even soften it with something along the lines of "you were right to report it, but..."


This FB Myanmar issue reminded me of the looming crisis at Telegram:

'Telegram has told Russian regulators that it is technically unable to hand the encryption keys to user accounts to the country's secret services, just weeks after the messaging platform was ordered to do so or risk being banned in the country. Roskomnadzor, Russia's communications watchdog, told the company last month that it had two weeks to give the FSB, successor to the KGB security agency, access to the company's encrypted messages or face the possibility of being blocked.' https://www.ft.com/content/84a878da-3664-11e8-8b98-2f31af407...

Iran (where Telegram has 40m users) is on the verge of banning it too, as it was 'used to organise mass protests last year'.

We seem to have gone from FB being an enabler and hero of 'arab spring' to now being accused of being a tool of darker forces against states. Telegram has raised over 2 billion USD (of probably dodgy money, given the terms of its ICO) and may now be crippled by state interference...


> We seem to have gone from FB being the enabler and hero of 'arab spring' to now being accused of being a tool of darker forces against states.

Well, (1) you may have confused FB with Twitter in regard to the Arab Spring (both were involved, but Twitter seemed to get more credit at the time), (2) the Arab Spring itself was a set of anti-State uprisings, and (3) in Myanmar, the atrocities FB is accused of facilitating were committed by, rather than against, the State.


In regards to point 1, Facebook was indeed lauded at the time as a catalyst for the Egypt revolution (more so than Twitter). A good illustration of this was the Egyptian guy who named his child “Facebook” in honour of its role in the uprising.

http://edition.cnn.com/2011/WORLD/meast/02/21/egypt.child.fa...


I'll change 'the' to 'a' as both FB & Twitter were lauded in the western media.


Your initial statement was correct.

In Tunisia -- the spark that arguably started the Arab Spring -- Twitter use was (and still is) virtually non-existent when compared to Facebook use.

Twitter is huge in KSA and some of the Gulf, though.


A slightly alt-analysis here.

These hatreds (if you will) are not new, at all. They have been "mismanaged" and/or conveniently exploited for as long as any of us can remember.

Certainly, the communications tool (aka FB) can play an enabling role. That is, none the less, a symptom; a symptom of a disease that predates the tool by eons.

The UN is confused and distracted; and it seems willing to let - once again - the true guilty parties off the hook. Yes, FB played a role. But to ignore the historic context is silly and dangerous.

The disease will persist. Because it can. Because it's easier to blame a symptom.


This whole concerted barrage of negative stories about Facebook seems more like the states trying to craft a specific narrative and influence the populace so they can censor the Internet.


I expected down votes, of course :)

But the lack of explanation for how and why FB should be held accountable for history is a bit disappointing.


It seems more likely that the people of Myanmar turned into a beast, and Facebook reflects that. It's not some fringe ideology. Everyone from politicians to monks have voiced their support for this.


It's more likely that there was deliberate propagandization of the population and Facebook was one of the mechanisms (like radio was in the Rwanda genocide.)


> "This work includes a dedicated Safety Page for Myanmar, a locally illustrated version of our Community Standards, and regular training sessions for civil society and local community groups across the country." [1]

Obviously I don't speak Burmese, but given Facebook has been weaponized for hatred and ethnic cleansing, a one-page cartoon PDF seems more than a little inadequate.

[1] https://www.facebook.com/safety/resources/myanmar

[2] https://scontent-lht6-1.xx.fbcdn.net/v/t39.2365-6/15516483_3...


Social-news platforms like Facebook and YouTube are automated media machines, without the checks and balances of a people-heavy news media company. In traditional news, employees maintain stronger consciences because they deliver the news manually day after day; they are not separated from that news by automation, unlike the engineers working on social-news platforms.

Because of this, social-news platforms are more free to recommend stories to users that promote anything; as long as the user clicks, it's a win. With this anything-goes approach to news, we get sensational stories, conspiracy stories and hate stories, because people click on them. Obviously the consumers are also responsible, but being perhaps accustomed to journalistic standards, maybe they are culturally unprepared for the level of bullshit-dressed-as-news that we are seeing online. I also notice the problem is compounded by the aggressive evolution of head-faking in media. For example, it's becoming really hard to know what is real news vs what is marketing-dressed-as-news. How many commenters are real people? Basically I think algorithm-driven news and marketing is outpacing traditional society and creating something new; time will tell what, but it might be a monster...


I'm not sure I would paint traditional media companies, like say local news stations, as paragons of integrity, autonomy, and having a strong conscience.

Exhibit A: https://youtu.be/hWLjYJ4BzvI


Zuckerburg today:

"I remember, one Saturday morning, I got a phone call and we detected that people were trying to spread sensational messages through — it was Facebook Messenger in this case — to each side of the conflict, basically telling the Muslims, “Hey, there’s about to be an uprising of the Buddhists, so make sure that you are armed and go to this place.” And then the same thing on the other side."


He seems to be interviewing with anyone who will take him to discuss these recent issues - except Congressional and Parliamentary committees. That's when he decides it's best to send "the people in charge of those products", because those people are supposedly better informed on what happened.

So if we are to believe his excuse, it's not okay to mislead Congress, but it's fine to mislead the public with incorrect information.


This quote is from this Vox interview: https://www.vox.com/2018/4/2/17185052/mark-zuckerberg-facebo...

The really interesting follow-up question here, which didn't get asked, would've been, "who's doing that?"


I imagine the answer is relatively uninteresting, and consists of "many ordinary people from the respective ethnic groups who have Facebook accounts, who are in turn influenced by decades of ethnic tension, rumours, actual news and explicit messages from organised groups mostly within the country"

The only real question is whether Facebook is a more effective vector for such messages than offline rumour mills, and I'm not sure it necessarily is. (If anything, I'd say social media has added an awareness that the outside world has rather lost its admiration for Aung San Suu Kyi and been a lot more sympathetic to the Rohingya than the average Burmese person might expect, but I'm not convinced that's helping much either.)


The quote seems to imply that it was a single group of people that were inflaming tensions in both other groups. That's a lot more nefarious than people with ethnic grudges.


It raises a few 'who' questions: not just 'who is sending these messages?' but also 'who receives that alert and makes the decision to block these things?' and 'who's in charge of the heuristics the system uses to identify these behaviors?'


That's wicked. Does FB know who is inciting this violence? Is it a bot swarm that's training to weaponize FB/social media?


IMO this is people being shitty, not facebook. I would probably title it "Facebook has revealed the beast in Myanmar".


The beast is always there in people if you look on a large enough scale.

People managed to spread hate even before Facebook, but Facebook and other new communication platforms have made it dramatically easier, the same way guns were a dramatic improvement over swords and arrows. Guns are regulated one way or the other all over the world.

We shouldn't single out Facebook here, as the same could have been achieved using any alternative like WhatsApp or Telegram.

The problem is, these platforms with their tremendous benefits have presented society with a new problem which society should acknowledge and fix somehow.


> We shouldn't single out Facebook here, as the same could have been achieved using any alternative like WhatsApp or Telegram.

Not sure how Telegram works, but WA and FB (the social network) are very different. The effort required to spread a link/message further is much lower on FB: it's literally one click to reach all your contacts (or group members). That's not what you do with WA.


I will argue that WhatsApp has a higher potential of spreading rumours than facebook.

1. You can easily broadcast a WhatsApp message to multiple groups. Most people I know are members of 3-4 groups.

2. WhatsApp messages are transmitted instantaneously, and you will surely receive them, unlike facebook where what you see in your timeline is not deterministic.

Moreover, even if a friend of yours sent you some link without reading it, it is nonetheless treated as a personal message from someone trusted.

Also, WhatsApp is always on, always checked by people, and at least where I am, people use WhatsApp a lot more than Facebook. It has truly replaced SMS.


I guess it depends on the style of usage. Nobody I know sends a "this is an interesting link I don't really expect anyone to read" type message on WA. But everybody on FB and Twitter does it.


Heck no. Seeing how some of my family use Whatsapp - blind forwarding of literally any message to everyone on their contact list - I'd say it's a couple of orders of magnitude worse. On FB you can report a post and have it taken down. No such thing on WA.


'In Myanmar today, Facebook is the internet'

http://foreignpolicy.com/2017/11/07/facebook-cant-cope-with-...


Facebook: Connecting people is good, even if we help some killers along the way.


Actually, I think this whole "connect people at all costs" meme is pushing the wrong narrative. Facebook's failure isn't that it connected too many people, it's that it only made certain kinds of connections, while failing to make others.

In other words, they've done the easy work of strengthening existing weak social graphs—and gotten rich doing so—without doing the hard work of building actual new connections between people.


Can anyone do that? That’s literally strong AI territory... They do offer forums, instant messaging, voice chat, video chat, as far as tech goes they offer everything that is available today.

But they can’t make me a buddy with Shaq, it’s not their job :)


> But they can’t make me a buddy with Shaq, it’s not their job :)

This is what Facebook has always failed at, but where MySpace succeeded. On MySpace, I became friends with and met some minor celebs; no such thing has ever happened on Facebook (except maybe in its very earliest days). Facebook has never been about new connections, or even reaching a new audience; it was designed start to finish as a way to broadcast to a network that you already have. Unless, of course, you're an advertiser.


> Can anyone do that? That’s literally strong AI territory...

Why wouldn't it be tractable with current AI? I'm assuming they have the brightest minds and some fairly hefty AI software constantly analyzing and running A/B experiments, the result being that the FB experience is gratifying, sticky or even addictive. Now, why not re-purpose some of those resources so that it becomes gratifying for people to expand out of their bubbles? The hardest thing about it is presumably that it doesn't contribute directly to the bottom line.


"Maybe it costs a life by exposing someone to bullies. Maybe it facilitates genocide."


I find it funny how everyone focuses on this line because out of context it sounds "evil", even though it's far from the focus or what's shady about the memo itself. That line just implies that "the tool" can be used for bad stuff, but then there's also:

"Maybe someone finds love. Maybe it even saves the life of someone on the brink of suicide."

And, just like postal mail, phones, SMS, newspapers, word of mouth, radio, etc. "the tool" can be (and is probably most often) used for good. What's surprising about this?

What's actually shady about the memo is not how the tool can be used because of course it can be used for both bad and good things just like any other tool (you can kill a man with a baby bottle if you know what you're doing).

What's shady is that it's claiming the ultimate goal, above anything else, is to put the tool in as many people's hands as possible. And they'll do anything possible to achieve that, even questionable stuff. This memo is about their shady growth practices, not what the platform is used for.


Could say the same thing about iPhone encryption...


Why is this getting downvoted? Phones have been used as tools by bad actors and criminals too, that's a fact.


Andrew Bosworth has gotta go.


Would you disagree with that? It's basically the same thing we say about roads.


yes, and just like we would want people to design roads in ways that limit driver deaths, we should insist that people design facebook in ways that limit negative consequences inherent to social networks. the problem is that facebook (a) is opaque about these decisions and (b) has shown that it cares more about maximizing profit than it cares about safety and security.

the only way to change the infrastructure is to gain control over it. hard to do when power is entirely held by Facebook. Regulation is one way to exercise control, but I'd rather see some kind of mass boycott / unionization among social network users to force Facebook to be more transparent and secure about user data (for what it's worth, I've tried to set something up here [1] and here [2]).

[1] https://medium.com/@oddbert2000/call-for-a-facebook-users-st...

[2] https://www.facebook.com/pg/Internet-Users-Union-12831967051...


> just like we would want people to design roads in ways that limit driver deaths

This isn't a response. We don't design roads so they're less convenient for murderers than for other people.


About roads, and mobile phones, and the internet. Looking further afield, confidentiality in finance is generally considered a good thing but has enabled many of the world's worst dictators. The US government has installed or propped up many of those dictators in the name of principles that were (at least by some) considered good at the time. Some people think unlimited access to guns is a good thing, but there have been some debates recently about the consequences of that. Killers use many services. So do police. Blanket assertions about whether the provider of a given service is good or bad are worth a lot less than analyses of what they might do - or be allowed to do - that would have shifted the balance.


Roads are public, and everything around roads and transportation is highly regulated and engineered with actual PUBLIC HEALTH and SAFETY at its core.


Most roads are generally a public good, which are not used to generate profit to enrich a small number of people. It's also clear where they lead and how they work.

Facebook on the other hand have put themselves in a position where they make a profit by offering proprietary mediation (via content selection algorithms and censorship) on top of UGC, as an ad-revenue optimization engine. Once you're in the optimized-mediation business, you're faced with lots of choices, and many of them will pitch people's lives against your profits (not just ad revenues, but how many people to hire, and where to invest them).


Who profits off the roads? Because that's what's at issue here.


People who own the roads and collect tolls?


There's no better example than this of technology's unintended consequences. Tech has two sides but we only like to focus on the good one. We don't like to face the hidden beast.

We all need to keep that in mind.


> Tech has two sides

I'd argue tech doesn't have two sides, but rather _we_ have two sides.

If you make it about "tech" it implies it's somehow external to us, inherent to tech, and independent of what we do, when in fact, tech is just a tool and since we're the ones using it the fact we can do good or bad implies tech will always be able to do good or bad.


I'm no fan of Facebook. I quit using it in September last year. But for this particular issue, the inciting of hate, they are no more or less guilty than the 'traditional' tabloid press.

Look at the comment section of your local 'populist' newspaper (over here it is https://www.hln.be/) and see how hate speech rules supreme and how sensationalist articles milk for those comments and likes.


A few times I have reported hate speech on Facebook, but every time the report was rejected. It seems like hatred towards certain groups of people is allowed on Facebook.


Black Mirror isn't even prophetic at this point. It just feels like a documentary on real life.


As long as Facebook still aggressively and effectively polices its platforms for violations of its reactionary puritanical agenda, nothing they say about the difficulties of policing hate speech and other hateful propaganda is in any way credible.


All the technology in the world is not going to fix human nature...


Does anyone here actually know how it is possible to contact Facebook's legal department? Who else could someone contact to whistle-blow something?


Anyone want to comment on the fact that Facebook/Twitter etc. are blocked in a few countries? Are they smart? Did they foresee all this coming?


I honestly wonder how Mark and SS sleep at night. Can't they afford to do better than this? Does their greed and shamelessness know no bounds? I don't get it, and it seems like something that will bite them in the long run and open them up to competition, whereas there might never have been a reason for a natural competitor to emerge if FB had just treated its ecosystem with a bit more respect and stewardship. Why don't they want Facebook to be like a curated garden, not a landfill?


This is too witch-hunty; Facebook didn't outwardly do anything to encourage this. By the same logic we can also blame the internet as a whole. There were similar messages on Twitter and chat apps. Facebook is big, so it gets the attention, but that doesn't mean it had outsized influence per capita. We need to be careful about putting full blame on the platforms that enable communication.


You might be interested in the concept of an attractive nuisance:

https://en.wikipedia.org/wiki/Attractive_nuisance_doctrine


From the wiki:

"landowner may be held liable for injuries to children trespassing on the land if the injury is caused by an object on the land that is likely to attract children"

This is precisely the point: facebook did not do anything special to attract these events. Quite the opposite; they try to eliminate them, with whatever success.


> facebook did not do anything special to attract these events

They built a platform monetarily incentivised to promote attention-grabbing content. They then let it loose in a country just opening up to the outside world. That is reckless. Combined with their history of profiting from such violence, it looks downright evil.

Zuckerberg et al are our generation's analog to Goldman Sachs' bankers helping the oligarchs pillage post-Soviet Russia [1].

[1] https://www.thenation.com/article/harvard-boys-do-russia/


They built a messaging platform that could be used by literally anyone to spread any message, and have done a pretty poor job of preventing its use by outside agitators.

That's definitely a tetanus-filled rusty playground if I ever heard of one.


Quoting the OP:

> By the same logic we can also blame the internet as a whole.

It may be trendy to hate on Facebook right now, but this kind of hand-wringing is ultimately an argument for internet censorship.


Exactly.

Which is what scares me. The news lately has scared a lot of people into this censorship-supporting mentality and the idea of some sort of thought police.

To have a healthy public discourse, what we need to strive for is educating people so immoral acts don't happen, not implementing mechanisms that could lend themselves to censorship.


They can also send those messages with phone calls, SMS, ordinary mail, some guy yelling in the street, or whatever else. Facebook doesn't seem particularly worse than the other alternatives.


> facebook did not do anything special to attract these events

An attractive nuisance is not something specially put there to attract a person or incident. It's something that attracts just by its being there in its current state.


A deeply connected world is a more volatile world [1].

We're on the cusp of understanding that - if we're lucky.

[1] http://www.niallferguson.com/journalism/miscellany/why-twitt...


More like: it doesn't take much for people to turn into beasts.


Move fast. Break things.


It is wrong to blame a company for people's evil.


Facebook is not only bad for Democracy but it is also bad for pluralism. It needs to be regulated and deleted.


> Facebook is not only bad for Democracy

I'd argue TV and broadcast media is worse for democracy. Facebook at least allows two-way communication and some degree of customization, whereas TV it's just CNN, Fox News, etc telling you how to feel and what to think.


> Facebook is not only bad for Democracy but it is also bad for pluralism.

Given its current shape and policy, I agree. But...

> It needs to be regulated and deleted.

...that sounds a bit like "the detainee shall be beaten and then killed".


If it's bad for Democracy, and I believe in Democracy, I'm going to delete that shit.

And for those that do not delete it, it needs to be regulated.

Today it's completely clear: Mark Zuckerberg lacks the willingness to self-regulate and does not appreciate the damage his platform has done to our society.

FB isn't a detainee nor a victim.


https://www.bloomberg.com/news/articles/2018-04-02/missouri-...

"I influenced three senators for $477.85"

"The goal of the ad campaign was to convince people to call their Senate offices and tell them to vote No on a confirmation. I registered the domain dumpdevos.com anonymously, set up a Facebook page, and we were off."

Source:

https://medium.com/@colinsholes/i-influenced-three-senators-...


That bloomberg link is broken for me.



"Indeed, when I asked the company whether it would permit an external audit of its News Feed workflow and algorithms to prove that there are no hidden or inadvertent biases against stories critical of itself, a company spokesperson repeated its statement that it believed there were no biases, but did not respond to two separate requests asking whether it would permit an external audit to prove it.

...

Machine learning approaches are especially troubling, as the company continues to refuse to release any information about the functioning and accuracy of its models, even as they play an ever-greater role in shaping what two billion people can see and talk about in its walled garden.

Most recently, when asked about its efforts to train machine learning models to autonomously decide what is "fake news," the company responded that it was using a large number of signals (though it declined to elaborate on the full list of signals used) to train computerized models to fully autonomously scan what is being posted and discussed on Facebook and identify new stories the algorithms believe are false - all without any human intervention.

...

Despite controlling what nearly a quarter of the earth's population sees and says online in its walled garden, the company has survived nearly a decade and a half of privacy outcries without ever having to open up and give its users even the slightest insight into how they are being manipulated, moderated and commercialized.

...

Putting this all together, Facebook's utopian vision has devolved into a surveillance dystopia in which even its programmer creators can't be certain how or why it makes the decisions it does.

In the end, the telescreens of Orwell's 1984 only surveilled the citizenry at random, while Facebook's unblinking algorithms never let us out of their sight, silently shaping what we are able to see and say without us having any right to understand the rules they quietly enforce, while even their engineer creators are not fully aware of the ramifications of the myriad inadvertent decisions that went into their programming."

Source:

https://www.forbes.com/sites/kalevleetaru/2018/04/02/faceboo...


"Facebook says "authenticity" is key to the social network and rigorously policed, and that false information violates the terms of service agreement.

...

Computer engineer Ryan Barrett fills in online forms with 0000s whenever a number is required and uses dashes for words. He says it is mostly out of principle: he wants to be in control of his information. Also, it's fun to try to fool the marketers. He has used a dozen different spellings for John Doe rather than entering his name. He even misspells his name when reserving airplane tickets and says it has never created a problem going through security.

...

He says he has friends who work at companies that look at multiple services to link up and cross-reference data on individuals - data gleaned from mobile phones, social media, grocery store loyalty cards and more. When those friends searched for him in their systems, they found little to no information. "There's a small feeling of satisfaction," he says.

...

All the lying does seem to foil advertisers. It is "a much bigger problem than people are aware of," says Nick Baker, director of research and consulting of U.K. market research company Verve, which conducted a 2015 survey showing a large amount of fake information on website registrations and the like.

Incorrect birth years, he says, are particularly nefarious because advertisers are often trying to match up habits or buying patterns with a specific age group.

...

Preethy Vaidyanathan, the chief product officer of New York-based marketing technology company Tapad, says they track much more valuable information from phone and web browser use.

Still, Ms. Vaidyanathan sees the value in hiding identity online. She says she uses a second email address with a fake name that she gives out to companies she doesn't want to bombard her inbox.

Source:

https://www.wsj.com/amp/articles/you-werent-born-in-1910-why...


"Facebook Allows Advertisers to Target Users on the Basis of Their Interest in Illegal Firearms"

"And it doesn't seem interested in closing this loophole any time soon."

https://slate.com/technology/2018/04/facebook-lets-advertise...


[flagged]


The Dalai Lama represents Vajrayana Buddhism, which has tended to be more peaceable. Myanmar and Sri Lanka - two Buddhist countries that have persecuted minorities within - are of the Theravada school. https://en.wikipedia.org/wiki/Southern,_Eastern_and_Northern...

Some Buddhist monks in these countries are the chief propagators of the hate - in Myanmar, the head of the 969 Movement that's driving the Islamophobia is the monk Wirathu.

See https://en.wikipedia.org/wiki/969_Movement


Wow, that sounds like a Japanese transcription of "Wrath" - ヰラテュ.


But then, before the Chinese invasion and occupation of Tibet, lamaism was not what I would call a particularly kind and peaceful thing.


  how contradictory it is that these attacks are being carried out by proclaimed Buddhists
Most religious killing is highly contradictory to said religious texts.


[flagged]


None of the religious texts of the popular religions where a particular text is central have a strong, well-defined “internal logic”, just a variety of different interpretations of varying popularity.

(Also, the idea that a religion is inherently defined by a text, implicit in the upthread comment, is a false generalization.)


None of the religious texts of the popular religions where a particular text is central have a strong, well-defined “internal logic”

Let me clarify: if a logic can be found in a religious text, it will be used. Logic of violence, logic of profit, logic of slavery...all of these are illustrated by history.


Someone hasn't read the Old Testament. God mellowed out when he had a kid, but before that he smote entire cities. Sometimes God did it himself. Sometimes God ordered Israelites to commit genocide.


You're saying that Old Testament has "a strong, well-defined internal logic"? If so maybe you could do some more reading. I don't think e.g. Genesis and Job fit together at all... how about Deuteronomy and Ecclesiastes?


You've been here for ages; you know where threads like this go. Please don't pick religious flamewars here!


Fair.


Every major religion has been used for violence.


[flagged]


That is an utterly uncharitable interpretation of what's happened here.


We live in a world where knowledge of Islam by non-Muslims can be detrimental to Muslims who only follow certain verses (beliefs) of their book (aka peaceful Muslims); unfortunately for them, the holy book is immutable (and can also be used to justify ISIS).

Also, any form of attack on Muslims permits Muslims to retaliate, causing a vicious circle. [1][2]

1. https://www.quran.com/9/36

2. https://www.quora.com/Where-does-it-say-in-the-Quran-to-kill...



