If we don't identify and understand what is going on, we won't have an informed response.
Facebook shouldn't get credit for the Arab Spring. They didn't start it, they were just one of many mediums used for its organization and dissemination.
What they and all social media get credit for is existing as a communications platform that's largely uncensored.
Why does Facebook's algorithmic ranking of feed content get a free pass, here?
The FCC is insulated enough from partisan politics that its censorship doesn't run along partisan lines, and is therefore not very controversial. But TV is absolutely censored.
"Credit" confuses causal importance with (human) responsibility.
Norman Borlaug engineered a variety of wheat that fed a lot of people and saved a lot of lives. What deserves credit? The wheat or the inventor? Or Mendel et al., for pioneering the principles of genetics that Borlaug used in his work? Even if "someone else would have done it if Borlaug didn't", he is still (one of) the most important human actors responsible for averting the crisis. The materials/products used are causally important, not humanly responsible.
In this case, humans were responsible for the Arab Spring. For the most part, social media platforms were causally important to the process.
It gets much harder to draw that line, though, when the platforms are behaving in activist ways (e.g. $literally_any_technology_headline_in_the_last_week).
Is it still viewed as a positive event? Are there any countries where it led to improvements instead of instability?
Imagine if the U.S. Postal Service could keep track of which pieces of mail get opened or read, and then they decided to only deliver the mail that made people the most angry. Sound like a good postal service?
There's a big difference between opening something up (many times needing a warrant, as your link says) because it may contain a weapon or biological agent or invasive seeds of a plant species, vs controlling what information is passed or not.
"A colleague acknowledged that the service had helped people in the country communicate with each other."
It leaves me dumbfounded that a journalist can write that sentence and not see anything wrong with the whole argument.
The printing press was a physical machine. How anyone can posit this as an argument with "half a brain" is beyond me. These are completely different scenarios.
Facebook can curate content, and has a long history of doing so. That immediately makes it entirely different from the printing press.
Comparing it to a newspaper is fair: it curates content. Sometimes it is biased, willingly or not.
It's a shame that on HN, whenever the industry has issues, they're immediately ignored and shuffled under conspiracy theories/government takeover/etc. There are valid concerns here, never mind the fact that entities like Facebook and Google are essentially becoming governing bodies.
To work off of your post office analogy, consider if it were possible for random third parties to inquire of the post office who some particular person has corresponded with in the last several years and consider how that could be abused in a dysfunctional society where being friends with the wrong person could have very bad consequences.
I don't know if there's a solution to that, other than for Facebook to be much more restrictive about who can see a user's list of friends. However, that would interfere with the way people ordinarily use Facebook.
In the case of a government authorized genocide, however, it's likely the whistleblowers who would be silenced by government regulation of political speech.
> In 2015, researchers evaluating how Facebook Zero shapes information and communication technologies usage in the developing world found that 11% of Indonesians who said they used Facebook also said they did not use the Internet. 65% of Nigerians, and 61% of Indonesians agree with the statement that "Facebook is the Internet" compared with only 5% in the US. Source: https://en.wikipedia.org/wiki/Free_Basics#User_experience_re...
Facebook certainly isn't the party chiefly responsible, but it's an effective catalyst.
Consider that people in Myanmar have been isolated from the world for the better part of the last 40 years. They're effectively playing catchup with the rest of the planet. The country is already afflicted by all the symptoms you can attribute to dictatorship, tribalism, various conflicts, lack of information, lack of education, superstition, lack of infrastructure, etc. Take that as the base context and then introduce the population to technology (smartphones) and "the internet" (mostly via FB). Step back and watch bad things happen... much faster.
Regardless of the article not pointing all fingers at all culprits, what concerns me specifically about Facebook is their recklessness, especially in some of these emerging countries. Maintaining peace in many of these places is already difficult and the various agreements are fragile, but FB, in its pursuit of growth and profit, has maintained an attitude of continuous and utter denial when it comes to its product's potential as an instrument of propaganda. That, to me, is almost insulting.
We all knew that when FB and Twitter started to centrally control content well beyond the obviously really bad stuff (i.e., gore, child porn), that mandate would forever expand and expand, to the point where it's almost impossible for FB not to be criticized for not doing enough... absent a massive expansion of content controls, which means massive expense, which means incentivizing a) simply limiting people's ability to communicate on the platform AND/OR b) automation. Which ultimately means countless false positives and examples of bias from machines, trained by the most vocal special interest groups, deciding what is and isn't okay to say to another person.
The future is going to feature some interesting trolling to see who can game algorithms to get topics banned on social media through phony media outrage campaigns, false reporting, and social engineering.
It's so good to hear that there are some responsible governments out there with respect to the dangers that Internet rumors can bring in volatile situations.
Isn't this a bit of a red herring, though?
First, "shifting blame toward Facebook" is not the only thing being done. People are finally allowing themselves to become somewhat aware of the toxicity and perils of data. Facebook is just the first (and one of the biggest). Many people (esp. on HN) already knew, but even among them, a lot preferred to believe it probably wouldn't happen, or just not think about it much (like climate change; faced with a problem of impossible magnitude).
Second, that it doesn't solve this particular problem (immediately), isn't a reason to dismiss placing blame. In fact, the act of placing blame doesn't solve problems by itself, usually.
However, putting a spotlight on the data hazards of social media may help solve some of these problems, by pushing the corporations farming these networks to come up with better solutions. The effect on Myanmar will be small (but what do you expect? the damage has already been done), but it's totally worth it for the combined effect of putting up a barrier, even if it merely lessens the odds that large attack surfaces for propaganda-based exploits of the human psyche like this can arise anywhere in the world.
And it is currently happening in a lot more places than we are aware of (or perhaps even expect).
It has to be. I've seen technical presentations on security and privacy (from CCC, etc.) speculating about these ideas since at least 15 years ago: "What if governments got hold of all this data and used it for nefarious purposes?". What they didn't always predict, except sometimes by implication, was groups other than governments doing that. Interestingly, I definitely can't remember predictions about it being offered as a commercial service by shady analytics companies such as Cambridge Analytica and Palantir. Maybe more recent talks cover this; I haven't been keeping track as much. But it's accelerating the problem in a somewhat unexpected direction. A question, if anyone knows: aren't those services technically (or partially) black market?
I don't want to single out "data sharing" as the big problematic thing either; there are lots of other dangerous things that can be done with data from user tracking. In fact, just like radioactive waste, I'd argue it is still dangerous at rest (and worse, you can't always know whether copies remain).
'Kalar are not welcome here because they are violent and they multiply like crazy, with so many wives and children,' he said.
Mr. Aye Swe admitted he had never met a Muslim before, adding, 'I have to thank Facebook because it is giving me the true information in Myanmar.'"
There are so many things wrong about this single sentence.
What makes you so sure you're the one getting so many things right at once?
Perhaps in a decade or so's time someone in Facebook will be called before an international criminal tribunal for what's happened there. Perhaps not. There's a bit of a case backlog.
The people who built the radio didn't get called before an international criminal tribunal; it was the broadcasters who did.
FB is both though. They built the system & tech (the ‘radio’), and via their algos/policies also control what gets spread and to what extent (the ‘broadcasting’).
It’s not at all clear cut.
Now, consider that it's common across South-East Asia to have very basic data plans that allow you to navigate Facebook for free (or at heavily discounted rates). https://www.mmtimes.com/business/technology/20685-facebook-f...
For a huge lot of these subscribers Facebook is the Internet.
People in the developed world, who have gone through innumerable Internet phases and fads, and who presumably have more experience and a better sense of its subcultures, still have a hard time distinguishing fake news from legitimate news.
How do you think a small towner in Myanmar who went from zero to Facebook overnight would fare?
>Radio Télévision Libre des Mille Collines (RTLMC), which broadcast racist propaganda, obscene jokes and music, becoming very popular throughout the country. One study finds that approximately 10% of the overall violence during the Rwandan genocide can be attributed to this new radio station.
The two founders and one announcer were sentenced to life, subsequently reduced to ~30 years in prison: the people who built this particular radio and the broadcasters, together.
People should not be allowed to communicate certain things which lead to harm to other people, no. I don't understand why this gets turned into "people not being allowed to communicate" (implying "at all"?).
If we think that Facebook should decide what's permissible, then that's US culture being the global nanny. If we let the local government decide what's permissible, then you better watch out because every single genocide in history has involved demonizing the victims as the real danger (so, their speech would be pointed at as "harmful" and banned, the perpetrators would be painted as organizing a just revolution). The worst genocides were conducted with state support, and wouldn't be stopped if local national governments were moderating Facebook. Forming a special panel to decide which groups deserve to get their message out seems like a very dangerous and poorly motivated concentration of power. I can't think of any other ways to decide which speech is dangerous and ban it, therefore I conclude we're stuck with not doing it.
Facebook shows you whatever the company decides from the feeds of your "friends". Probably based on some algorithms, but it could be millions of cats rolling over keyboards, for it's not public what Facebook really does.
Third parties analyze and weaponize this behavior for many things, from commercial interests, to justifying murder, to incitement to violence.
Facebook needs to be held responsible for its algorithms. Somehow. Stupid autonomous car makers will eventually be held accountable for their algorithms too. It's not like they can put some crap in their EULA and be done with it.
PS: Add to it that Facebook is the internet in some places. It's not just one of the websites.
Many people today wish we had a more open world, myself among them. Do you know we didn't even used to have passports? There was no Schengen zone. You could freely travel between nations as a human being. But passports began to be required as a "temporary war measure" in WW1. And once governments had that new level of power and control, they of course not only did not relinquish it but expanded it by orders of magnitude. In the US, income tax was another "temporary war measure." Government is not your enemy, but it is also not your friend. Beware the implications of granting power to an authority that does not, or at some point will not, necessarily have your best interests in mind.
People should be allowed to communicate period.
People should not be allowed to harm other people.
A recent example of this is the ability of conspiracy theorists to spread news stories after these mass shootings claiming the attacks were staged or the victims are "fake". This further pushes the harmful propaganda that there is no real threat to students in schools, and the "third party" children are the ones who suffer from this communication.
For example, see every war ever. In any war, both sides believe they're right, but they can't both be right, so one or more is always wrong. They just can't see it because they can't process information safely.
And this is where you are lacking critical thinking. Single-issue wars, and issues with only two sides, are rare enough that I'm not sure any have ever existed. It's definitely possible for both sides to be right (and wrong), and it's definitely normal that educated people won't agree no matter how long they discuss an issue.
"I would be better off with that bloke's resources".
I'm not sold on the idea that wars are fought for ideals. There is a lot of coincidental wealth transfer when countries fight. Why else travel long distances to kill people you don't know?
It's not like there's a big man in the sky telling everyone that wars must only be fought between two opposing sides with identical motives on a logically comparable continuum.
It's not even like wars must even be based on positions that can be meaningfully called right or wrong. Or that most wars can be meaningfully interpreted as "two sides". Even within the allies of world war 2, you would have a variety of factions and motivations, as well as minority players with diametrically opposed views and beliefs.
And who is going to do that? Governments have no incentive to, because a lack of critical thinking in citizens is a major factor in ensuring that they can get away with exploitation, and it prevents citizens from revolting.
Private companies try to offer primary education in some developing countries. Since the problem is not only lack of resources, but lack of infrastructure to use those resources (no trained teachers etc.) this is a plausible path forward. I have no statistics of the results. Is it just one more scheme to funnel aid money into investors pockets? I have no idea.
Teaching students critical thinking means they are encouraged and taught to ask questions first, before obeying. Governments prefer a population that works obediently and pays taxes without questioning to a populace that asks questions at every turn. So they would prevent the kind of education that could lead to such a population...
Well, all public education is based on the Prussian template so what would you expect :)
I think the tendency of the masses to follow the perceived leader is a facet of the human psychology rather than the result of a nefarious scheme.
From what I can tell of the education of my kids in Finland, the teachers really try to walk a tight rope between having small kids behave in an orderly manner as a group and not to stifle individual creativity and will to learn.
You need some authority, and respect for it, in schools. Otherwise it would turn into a Lord of the Flies remake. I presume your critique is directed towards overbearing, paternalistic, unnecessarily harsh discipline.
I think the amount of this depends on the country you are discussing, and you need to be more specific. All primary education systems are not alike.
Maybe to an extent. But you cannot deny that this is exaggerated by social pressure and training from childhood. For example, in most schools, obedience is seen as the foremost quality of a good student.
Imagine the animals in the wild and how they can be tamed by training them in a specific way. You can see how a small man or woman is able to control a large animal such as an elephant or tiger in this fashion.
When you look at that, you see how easily the people could revolt against the increasingly oppressive actions of governments if they put their minds to it, and how they don't. Just as a tiger or lion could easily refuse to obey its trainer, but doesn't, purely because of the training it has received since birth, the thought of not obeying simply does not cross their minds.
You see, you have to stop looking at the people individually, and have to look the populace as a single entity. Like a reagent in a test tube. You pour in the obedience and compliance and you get more work and taxes out of it (and less unwanted reactions). You pour things like nationalism and patriotism into it, and you get lives to expend in the name of service of nation. You pour dreams of a better tomorrow, and you get votes...
I think all revolutions are started by tiny groups of inspired people, and the hoi polloi follow them, or don't, based on the current general mood and the network effects they instigate.
That's also an inspiring thought. To make a change, you don't need to convince everyone. Just a tiny group will suffice, and if the dice fall correctly, everyone else will just follow along.
This, of course, applies to both beneficial and pathological changes. The coarse-grained group mind of the population is not acquainted with ethics. That's why free speech and suppressing complete dickheads such as neo-Nazis live in a delicate balance. A small group can cause great harm to a society.
The funny thing here is that, since such a society won't lend itself to control easily, governments don't want this. They want a population that can be controlled easily, but then, as you said, it can also be controlled by a small group causing major harm, and the government's solution to that is regulating free speech.
So governments win on both fronts and the people lose on the same.
For example, immigration in Europe... some people want to make it easier others want to curb it completely. Who is right? There are arguments for both sides and your individual preferences and experiences determine which one you will find more convincing. There is no binary right or wrong over all cases.
I see plenty of similar sentiments in my facebook feed.
If you assume that there is an objective, external moral "right" defined in a very particular kind of way, this may be true, but that hidden premise is far from universally accepted.
Who are you to "allow" or "disallow" free speech?
"Disallowing" free speech is actually a human rights violation, isn't it? So what you're doing here is providing weasel moral grounds for human rights abuse.
The strangest thing is that there's a sizable number of women who support the law in its current form. Religion is difficult to comprehend.
Different localities have different cultures, and for some, it's good. Say for example, in my locality (Kerala), Muslim men have more than a wife only on certain conditions like:
1. The first wife can't bear a child and the man marries another without abandoning the first wife
2. Marrying widows so as to protect them (there might be occasional cases of exploiting this too, though I'm not aware of any).
I have also seen a handful of cases where a man marries another woman to threaten his other wives. But so far, I can remember only two such cases in my neighborhood, and none in my family. Usually such men get very little support from the community.
So women with such experiences may well support the current law.
Men deciding what women want is a repugnant thing to do. To use your own examples:
1. If the infertility is with the man, can the woman simultaneously keep two husbands?
2. Can a woman marry a widower to take care of him while being in a marriage?
Pretty sure this is blasphemous in Islam.
It's quite hard for a woman to get married again once divorced (in pretty much every village in India, at least). But for men that usually isn't the case.
And in Islam, it is a man's duty to protect and feed his women, not the other way around. I think this is true for all Abrahamic religions.
In Islam, this is true even when the woman is rich. That is, ownership of the woman's property doesn't change before or after marriage. The man has to provide food and shelter to his wife and children regardless of the woman's wealth; the woman is free from such duties. There are also other issues, like inheritance (establishing the true father of a child), or treating the spouses equally (to the extent that a man with two wives should spend alternate days with each wife. This isn't possible when a woman marries two men, where there is a pregnancy period, etc.)
And to answer your question: No. Islam doesn't allow that.
Women's rights are absolute, irrespective of which religion people belong to. If that means striking what's regressive from the holy books, that's what liberal believers need to do. The sooner we get to a Uniform Civil Code, the better.
(Same as most Abrahamic religions, since you mentioned those.)
And no, "it's our culture" is not an excuse for that.
Polygamy is common in all major religions in India. It can also be argued that, in most Indian cases, having a mistress is polygamy too.
You may also be a little surprised to know that Muslims are the least polygamous community in India.
>>The strangest thing is that there's a sizable number of women who support the law in its current form.
Polygamy has had its uses across history. Also, if you don't mass-murder girls like they do in India, you inevitably come up with use cases for things like this.
>>Religion is difficult to comprehend.
It isn't. Religion is social evolution. That a few things don't apply to one person doesn't mean they don't apply to billions of people.
For that matter, people of other religions register under the Special Marriage Act, under which polygamy is forbidden too.
Sure, but how common is it really? Are you talking 3% of Muslim men? (In which case it's peculiar to regard it as a particularly Muslim thing.)
And what, specifically do we object to about it? I'm against polygamy only in that (at least in situations I know more about such as the Fundamentalist Latter Day Saints movement, or FLDS) it seems very heavily entwined with abuse of power and/or underage marriage, and cults. If polygamy was not coupled with those things then I would struggle to identify on what basis (if any) it is objectionable.
I completely agree that polygamy by itself isn't objectionable. What's objectionable is allowing it only for men.
We just read about false information being spread about religions, and about Islam in particular. Let's be careful to back up what we say.
That, and the prophet was actually monogamous for 25 years (until his first wife, who was 15 years older than him, passed away).
It's amazing how biased you are in this.
You unfortunately have a history of violating the HN guidelines in comments. We've asked you several times to stop doing this. If you can't or won't stop, we're going to ban you. Please (re-)read https://news.ycombinator.com/newsguidelines.html and use this site as intended.
My point is - HN is largely about tech. Not about bias against or for a community. So STOP making it about communities and religion!
What people use these tools for should be discussed in the context of the people. Discrimination didn't start because of Facebook and won't stop if/after it leaves.
This is a problem of moral education, and I personally don't want Facebook to be the moral compass of the world. This is something the government and society of Myanmar need to work against and figure out in sovereignty.
Are we suddenly implying someone in a rich, first-world Western country should interfere with the politics of an independent Southeast Asian nation again? Please.
They already have. By deploying Facebook. And especially Facebook's decision to subsidise internet data use for customer acquisition.
What do you need, a flashing neon sign reading "tech" to be interested? He's directly pointing to one of the 5 big Tech firms; it could not be more relevant. Given that, what's your problem?
You can argue a lot of things about this, but “not related to tech” isn’t one of them.
Facebook seems to be the most viral/effective platform for spreading any (fake or true) news, and in this instance, the news was possibly harmful to a community. If not facebook, the perps would have relied on the next most viral platform.
That still doesn't explain his cherry picking of comments that speak to bias more than tech. So, no - his comment is NOT relevant to tech!!
To quote one of the more important figures in those questions surrounding chemical technology and nuclear physics, paraphrasing an old Buddhist proverb:
“To every man is given the key to the gates of heaven. The same key opens the gates of hell.
And so it is with science.”
I'm continually baffled, though less surprised lately, how many people still think that they can shirk all responsibility. It's important to take time to consider what effects your work can have on the world, and not think anything and everything justified in the name of making money. At some point, we're all in it together.
It’s all over this place, regrettably.
It's convenient to blame Facebook and ignore the government's actions. And it's convenient to ignore the fact that Facebook is infrastructure, like roads and electricity. When bad people use roads and electricity, you do not cut them off for the rest of society, or blame the roads for the bad things.
No – instead, the government passes regulation to minimize the number of people that die on the roads. Your car needs to meet safety standards to drive on the road, it needs to be registered with the government, and you need to be licensed to drive it.
Things that become interwoven in the fabric of society don't automatically just get a free pass. I would say the opposite happens, actually.
As someone who spent a lot of time in Myanmar in the past 6 years, I saw the country go from near zero internet to 50-60% penetration in a couple of years. The strange thing is that most people don't really use the web - it's all Facebook. The internet, for a vast majority, means Facebook. The way news spreads in Myanmar is gossip and rumours - probably because of very limited press freedom in the past. This combined with some ugly undercurrents of nationalism, extreme poverty (for all ethnicities) in the conflict areas, and decades of poor education, has sometimes turned Facebook into an amplified medium of hate speech.
I don't think Facebook is to be blamed for the violence (it has existed, and still does, in many parts of Myanmar against other ethnic minorities - Karen, Kachin, etc - before Facebook, and without the same amount of press given to Rohingya) but it has most likely amplified the hate speech.
Traditional journalism/media always gets a ton of criticism, but for all its flaws it at least strove to maintain basic standards (fact checking, neutral language) and make its best attempt at objectivity. With the media circumvented this decade, the crazy grandpas of the world united.
Partial quote for your convenience: (read the full comment for sources and more).
> FB stopped being a "data sharing platform" when it muscled its way into becoming synonymous of "the internet" in much of the underdeveloped world. In many of these countries (including Myanmar) people can get a really cheap plan that offers next to nothing beyond "Facebook Free Basic", FB's walled garden (e.g for 3$/month you get 100mb of data + unlimited facebook). This is effectively a two-level Internet with FB being the obvious default.
However, now that anyone can publish information fast and to a large number of people, false facts will circulate fast on any communication platform.
Here in India, in my village, people forward each other long false statements and rumours, and people believe them blindly and propagate them to their contacts. As a society, I don't think we know how to handle this.
Yeah, we've never had genocidal movements whose propaganda was spread, among other means, by in-person rallies or other such assemblies, precisely because “real world” gatherings have the kind of real time fact checking you discuss.
Imagine an in-person rally of 10k people. It would be a huge event and get huge coverage; bigger rallies even more so. So local media or the local government has a chance to counter the falsehoods and make people aware. Governments may even choose to disallow some gatherings in some parts of the world if they fear they will cause a significant law-and-order problem.
Also, offline gatherings depend on people propagating the issue, which would previously have required people brainwashed by, or dedicated to, the cause. Now even normal people can forward anything to all their WhatsApp contacts and boom, the chain goes on.
In Myanmar, as in most other cases, the government was an active belligerent; they wouldn't have been a check in any case. Genocides are usually conducted by groups that are already dominant in government, media, and society in general, so the kind of social institutions you point to are usually subverted before rallies (physical or virtual) directed imminently at violence occur.
So, it's quite unlikely that those institutions will check in-person rallies; more likely they will be organizing them and spreading their message.
People have little ability to change A, but could exert influence on companies to combat B if sufficient pressure were exerted...
Hypocrisy is an untenable position. Watching people argue that social media is equivalent to rallies, on HN would be funny if it weren’t so desperately sad. You can make this comment disappear, but you know I’m right about this. Even a brief glance through comment histories and submission histories bears it out.
This is not true. When groups of people get together in public, they're free to state falsehoods. Big Brother-style government oversight is not the answer to the "problem" of that freedom.
However, the new platforms allow everyone to create very large gatherings anytime. Both the size and number of these gatherings are very large, and so are their scopes. It becomes impossible for people to keep track of the content discussed in them, so people then go by whatever feels good to believe. On a country level, this can cause huge changes, both good and bad.
The interests of said minority are typically un-represented in local newspapers. The papers are often the second to cheer-lead this sort of behaviour.
Today, stories bounce from fake news, to Russian propaganda, to user data mishandling, to fueling genocide in Myanmar.
Correct me if I'm wrong, but I thought that was Twitter as the major player RE: Arab Spring.
I think 8 years is more than enough time for backend changes in a platform to change sentiment. Also, the platform may not have been used for as many nefarious purposes in the past.
The rhetoric around this has shifted, not because Facebook has done anything different. Or because subsequent uprisings have been any different.
But those mini revolutions of the Arab Spring were not what we thought they were. "The People" took power from the dictators and created the society they wanted, but it is LESS in line with liberal values than the previous one. This is what we failed to realize at the time.
Facebook has done an amazing job empowering the will of the people. Which we are still inclined to view as positive.
But when the majority of people WANT a government that is less tolerant of other religions than a dictator would be, what should we do? Should we blame Facebook? Or should we try to use Facebook to counter the hateful ideas that are being spread?
How trivial would it be to make sure this leader in the article who 'never met a Muslim' was forced to scroll through a feed with positive messages about Muslims and religious tolerance?
If Gandhi or Martin Luther King Jr. were alive today, could they sway our elections with a billion dollar ad budget? We can't force any other group to live by our values. But we can promote what our values and ideas are, in a way that might help them spread.
Maybe we should ask Facebook to help with this, instead of trying to take down the whole network just because some groups are using the platform in a way that we don't like.
This is tragically true. A revolution is a re-spin of the roulette wheel when it comes to outcomes; the result is absolutely not guaranteed. In the case of Egypt, the semi-secular military dictatorship was overthrown and became... an Islamic theocracy, which was actually worse for the average woman or non-devout Muslim, and the ensuing counter-revolution back to military dictatorship was reluctantly acquiesced to.
So when the users don't conform to the world view of the platform operators, the answer is platform sponsored propaganda? That is incredibly dystopian.
Advertising is propaganda dedicated to getting you to conform to the advertiser's preferences, and it's already targeted however the advertiser prefers, which could be by lack of conformance to those preferences.
You are basically suggesting that the platform operator doing themselves what they already build their business around allowing anyone with a checkbook to do is dystopian.
I would suggest it is no more dystopian than the status quo.
Facebook sponsors propaganda all the time. Its algorithms favor certain posts and messages over others, based on a long list of attributes, including ad dollars. But apparently that's OK, while suggesting that Facebook display some positive third-party post about Muslims in Myanmar - while a genocide is underway - is suddenly "platform-sponsored propaganda"?
Of course this is much easier to rationalize in this situation. However, that's how you end up giving away liberty. You give it up in little pieces in response to extreme situations, and then when you go back to your life you don't get it back.
The real issue here is that this gets the whole problem backwards. We shouldn't be looking at a state-sponsored genocide and then claiming that "if only Facebook had more control over public discourse then we'd be able to solve this problem". The "problem" in that statement can be anything from this genocide taking place, to your preferred candidate losing an election. "We need more propaganda" isn't going to solve any of that, and in reality it's just a veiled power grab by companies that wish to control public discourse more effectively.
The genocide in Myanmar has both state-sponsored and spontaneous characteristics. Facebook can and should help with the latter, if it's going to allow folks to pass around pro-genocide messaging using the site.
In the end, I advocate for everyone to abandon all large-scale, centralized, corporate social media, but given how unrealistic this goal is at present, my next hope is for the large social media companies to assume more responsibility for their actions. The same standards that we've traditionally held all media companies to.
Again, this may be a more compelling example of how you might define "hate speech". But if you think facebook has a responsibility to start moderating speech based on its perceived "hatefulness", then you're still going to end up at the same destination, where genuine public discourse is gradually replaced with corporate approved discourse.
I think you are hearing something I didn't say.
I'm just talking about promoting some TED Talks. Making sure everyone can name at least 1 good Muslim. Maybe trying to raise the profile of the people who should be winning the Nobel Peace prize.
If someone is IN a hate group, I would like them to have some limited exposure to the views of the other side. I don't think that's propaganda.
I think 'content moderation' has a much higher potential for abuse, and a nightmare future.
>A political philosophy supporting the rights and power of the people in their struggle against the privileged elite.
The idea that the government is going to go after Facebook because it is used to fuel the rights and power of the people in their struggle against the privileged elite is very scary.
I disagree with your sense that unqualified populism implies authoritarian populism. In my experience it's used without qualifiers to describe both Trump and Bernie. I suppose whether you think those two are authoritarian probably depends a lot on your own political leanings. But generally it means "fight the power," or in Trump's case, "drain the swamp." Here's a NYT article that, having skimmed it, seems reasonable to me.
(This comment is not intended to confer any positive feelings towards Trump or imply that he did anything other than make the swamp even swampier. It was only in reference to his rhetoric.)
I think both the Trump and Bernie movements have some frightening populist characteristics. But between the two leaders, only Trump has been broadcasting blatant authoritarian messaging.
That's not to say that Bernie would never engage in any authoritarian action if he were elected, but he's not made overt any authoritarian tendencies like Trump has.
From the wiki on Populism:
>Populism is most common in democratic nations and political scientist Cas Mudde wrote: "Many observers have noted that populism is inherent to representative democracy; after all, do populists not juxtapose 'the pure people' against 'the corrupt elite'?".
Sure, populist upheaval can lead to American Revolutions. But it can just as easily lead to pogroms, and one of those patterns is repeated far more often throughout history than the other.
The mantra of "move fast and break things" has driven us off a cliff, where our moral backbone is what has been broken.
"He recalled one incident where Facebook detected that people were trying to spread “sensational messages” through Facebook Messenger to incite violence on both sides of the conflict. He acknowledged that in such instances, it’s clear that people are using Facebook “to incite real-world harm.” But in this case, at least, the messages were detected and stopped from going through."
Notable: "through Facebook Messenger"
So one extrapolates that FB actively monitors private conversations carried on Messenger.
> Facebook's systems detected what was going on and stopped the messages from going through
Wwwwhaaat?! Some people may have just seen that message and interpreted it as "shit hit the fan, let's hide my family in a safe place until this cools down", even if it was intended as a "call to violence". Censoring messages like those could have just as well caused deaths because innocent people just didn't get the heads up.
Corporations should clearly define themselves as either "medium companies," which stay completely neutral to whatever flows through their platform as long as it's not "explicit content" (yes, this includes allowing "hate speech" as long as it's toned down, because that "hate speech" can also contain useful information, and it's not something clearly identifiable), or "message companies," in which case they can clearly take sides in conflicts, but also be responsible (legally) for their actions.
This muddy "middle ground" position that some companies take is "the root of all evil". Either let anything happen (including bad things), or pick a side, so that you can later be judged according to the side you picked. It's condescending to imagine that you're actually smart enough to "properly filter" information. You're not, or you're a tyrant imposing his value system on others.
I have more sympathy for a corporation that does evil deeds in the service of profit than for one that interferes in "muddy" ways in social issues and prevents clarity and the free flow of information. Sometimes this flow of information causes blood to be spilled, but sometimes problems get solved this way, if a society is not evolved enough to solve them in more peaceful ways. Toning down discussions and letting tensions accumulate is worse.
Facebook reaction? They found it doesn’t violate their community standards: http://const.me/tmp/fb-policy.jpg
Right, “no place for hate speech”.
I wonder what the threshold for reports is before an actual human looks at them, or what keywords will get Facebook's attention. Perhaps a bunch of people need to report simultaneously.
Hate speech is just not a priority in the way spam is.
If they did that with speech, then they would censor an insane amount of sarcastic speech as well. The false positive rate would be really high and that would be a chilling effect on speech. Similarly, the actual hate speech would find ways to use new coded terms and nuance to get around the filters.
Imagine if instead of "no female nipples" they had a policy of no overly sexual images. That would be way more subjective and harder to enforce. My bet is that this is the case with text-based speech.
"We also restrict some images of female breasts if they include the nipple, but our intent is to allow images that are shared for medical or health purposes. We also allow photos of women actively engaged in breastfeeding or showing breasts with post-mastectomy scarring. We also allow photographs of paintings, sculptures, and other art that depicts nude figures. Restrictions on the display of sexual activity also apply to digitally created content unless the content is posted for educational, humorous, or satirical purposes"
On Twitter, for example, there are a host of accounts that have been banned for seemingly harmless activity while other accounts with much more serious issues remain active.
Despite it being easier to report violations on Twitter (taking your word for it...I have no personal experience), standards aren't being applied fairly. That undermines the whole point of standards.
If it was him, maybe he fell under some clause of notable persons. I don't know what the content was like of course, but that could be one reason they didn't immediately delete.
Did they eventually remove or did the creator shut it down?
Yes, he is.
> who writes about conspiracies
No, he doesn’t. He’s an investment banker, and he mostly blogs about the consequences of Putin’s external politics (i.e. the ongoing wars in Ukraine and Syria) for Russian economy.
> Did they eventually remove or did the creator shut it down?
I don’t know what happened to that group.
Germany even passed an (admittedly very bad) law, mostly because of FB's unwillingness to take this problem seriously.
It's just as easy to suppress malevolent ideas with censorship as progressive ones. Handing that power to an elite few, however noble their intentions at first, is dangerous.
In this case though, it does appear they made a mistake.
No one is perfect, but I think Reddit is better.
At least they are transparent, i.e. unlike Facebook, Snapchat or Twitter they aren’t afraid to talk to journalists about their content filtering: https://www.newyorker.com/magazine/2018/03/19/reddit-and-the...
They even soften it with something along the lines of "you were right to report it, but..."
'Telegram has told Russian regulators that it is technically unable to hand the encryption keys to user accounts to the country’s secret services, just weeks after the messaging platform was ordered to do so or risk being banned in the country
Roskomnadzor, Russia’s communications watchdog, told the company last month that it had two weeks to give the FSB, successor to the KGB security agency, access to the company’s encrypted messages or face the possibility of being blocked'.
Iran (where Telegram has 40M users) is on the verge of banning it too, as it was 'used to organise mass protests last year'.
We seem to have gone from FB being an enabler and hero of the 'Arab Spring' to now being accused of being a tool of darker forces against states. Telegram has raised over 2 billion USD (of probably dodgy money, given the terms of its ICO) and may now be crippled by state interference...
Well, (1) you may have confused FB with Twitter in regard to the Arab Spring (both were involved, but Twitter seemed to get more credit at the time), (2) the Arab Spring itself was a set of anti-State uprisings, and (3) in Myanmar, the atrocities FB is accused of facilitating were committed by, rather than against, the State.
In Tunisia -- the spark that arguably started the Arab Spring -- Twitter use was (and still is) virtually non-existent when compared to Facebook use.
Twitter is huge in KSA and some of the Gulf, though.
These hatreds (if you will) are not new, at all. They have been "mismanaged" and/or conveniently exploited for as long as any of us can remember.
Certainly, the communications tool (aka FB) can play an enabling role. That is, none the less, a symptom; a symptom of a disease that predates the tool by eons.
The UN is confused and distracted; and it seems willing to let - once again - the true guilty parties off the hook. Yes, FB played a role. But to ignore the historic context is silly and dangerous.
The disease will persist. Because it can. Because it's easier to blame a symptom.
But the lack of explanation for how and why FB should be held accountable for history is a bit disappointing.
Obviously I don't speak Burmese, but given Facebook has been weaponized for hatred and ethnic cleansing, a one-page cartoon PDF seems more than a little inadequate.
Because of this, social-news platforms are more free to recommend stories to users that promote anything; as long as the user clicks, it's a win. With this anything-goes approach to news, we get sensational stories, conspiracy stories, and hate stories because people click on them. Obviously the consumers are also responsible, but being perhaps accustomed to journalistic standards, maybe they are culturally unprepared for the level of bullshit-dressed-as-news that we are seeing online. I also notice the problem is compounded by the aggressive evolution of head-faking in media. For example, it's becoming really hard to know what is real news vs what is marketing-dressed-as-news. How many commenters are real people? Basically I think algorithm-driven news and marketing is outpacing traditional society and creating something new; time will tell what, but it might be a monster...
Exhibit A: https://youtu.be/hWLjYJ4BzvI
"I remember, one Saturday morning, I got a phone call and we detected that people were trying to spread sensational messages through — it was Facebook Messenger in this case — to each side of the conflict, basically telling the Muslims, “Hey, there’s about to be an uprising of the Buddhists, so make sure that you are armed and go to this place.” And then the same thing on the other side."
So if we are to believe his excuse, it's not okay to mislead Congress, but it's fine to mislead the public with incorrect information.
The really interesting follow-up question here, which didn't get asked, would've been, "who's doing that?"
The only real question is whether Facebook is a more effective vector for such messages than offline rumour mills, and I'm not sure it necessarily is. (If anything, I'd say social media has added an awareness that the outside world has rather lost its admiration for Aung San Suu Kyi and is a lot more sympathetic to the Rohingya than the average Burmese person might expect, but I'm not convinced that's helping much either.)
People managed to spread hate even before Facebook, but Facebook and other new communication platforms have made it dramatically easy. The same way guns were a dramatic improvement over swords and arrows. Guns are regulated one way or the other all over the world.
We shouldn't single out Facebook here, as the same could have been achieved using any alternative like WhatsApp or Telegram.
The problem is, these platforms with their tremendous benefits have presented society with a new problem which society should acknowledge and fix somehow.
Not sure how Telegram works, but WA and FB (the social network) are very different. The effort required to spread a link/message further is much lower on FB. It's literally one click to reach all your contacts (or group members). That's not what you do with WA.
1. You can easily broadcast a WhatsApp message to multiple groups. Most people I know are members of 3-4 groups.
2. WhatsApp messages are transmitted instantaneously, and you will surely receive them, unlike facebook where what you see in your timeline is not deterministic.
Moreover, even if a friend of yours sent you some link without even reading it, it is nonetheless treated as a personal message from someone trusted.
Also, WhatsApp is always on, always checked by people, and at least where I am, people use WhatsApp a lot more than Facebook. It has truly replaced SMS.
In other words, they've done the easy work of strengthening existing weak social graphs—and gotten rich doing so—without doing the hard work of building actual new connections between people.
But they can’t make me a buddy with Shaq, it’s not their job :)
This is what Facebook has always failed at, but where MySpace succeeded. On MySpace, I became friends with and met some minor celebs; no such thing has ever happened on Facebook (except maybe in its very earliest days). Facebook has never been about new connections, or even reaching a new audience; it was designed start to finish as a way to broadcast to a network that you already have. Unless, of course, you're an advertiser.
Why wouldn't it be tractable with current AI? I'm assuming they have the brightest minds and some fairly hefty AI software constantly analyzing and running A/B experiments, the result being that the FB experience is gratifying, sticky or even addictive. Now, why not re-purpose some of those resources so that it becomes gratifying for people to expand out of their bubbles? The hardest thing about it is presumably that it doesn't contribute directly to the bottom line.
"Maybe someone finds love. Maybe it even saves the life of someone on the brink of suicide."
And, just like postal mail, phones, SMS, newspapers, word of mouth, radio, etc. "the tool" can be (and is probably most often) used for good. What's surprising about this?
What's actually shady about the memo is not how the tool can be used because of course it can be used for both bad and good things just like any other tool (you can kill a man with a baby bottle if you know what you're doing).
What's shady is that it claims the ultimate goal, above anything else, is to put the tool in as many people's hands as possible. And they'll do anything possible to achieve that, even questionable stuff. This memo is about their shady growth practices, not what the platform is used for.
the only way to change the infrastructure is to gain control over it. hard to do when power is entirely held by Facebook. Regulation is one way to exercise control, but I'd rather see some kind of mass boycott / unionization among social network users to force Facebook to be more transparent and secure about user data (for what it's worth, I've tried to set something up here  and here ).
This isn't a response. We don't design roads so they're less convenient for murderers than for other people.
Facebook, on the other hand, have put themselves in a position where they make a profit by offering proprietary mediation (via content selection algorithms and censorship) on top of UGC, as an ad-revenue optimization engine. Once you're in the optimized-mediation business, you're faced with lots of choices, and many of them will pitch people's lives against your profits (not just ad revenues, but how many people to hire, and where to invest them).