Where Countries Are Tinderboxes and Facebook Is a Match (nytimes.com)
81 points by dsr12 on Apr 21, 2018 | 84 comments

There are two types of internet-based attacks. The first is “cyberwar”, or kinetic attacks that cause real-world destruction. This is the kind the US has always been fascinated with and excelled at.

The second kind is the broader “cyberwarfare”, which encompasses the political / psychological. The eastern countries (Russia, China, etc.) have long been more deft at political and psychological offensives.

What if cyberkinetic attacks aren’t the best way to kill millions? What if simply inflaming tensions over Facebook, leading to revolutions and more “Arab Springs”, will kill tens of millions more?

Facebook could have been a better company, but they chose not to be, over and over again. They are a threat to peace and a catalyst for war and must be stopped.

> The eastern countries (Russia, China, etc.) have long been more deft at political and psychological offensives.

It seems implausible to me that the American/British "power pole" is anything like disarmed in the face of sinister Eastern Psiwar. The US invented public relations and has a vast advertising industry. A wide variety of public companies and private consultants exist, mobilizing all the tenets of psychology and sociology to influence the views of the masses. American democracy has served as an excellent laboratory. And that's not even mentioning US secret organizations, which have engaged in coups using psychological operations in numerous countries.

Anyway, if Facebook can ignite all these powder kegs, one might also ask where the powder came from. I'd say 30+ years of neoliberal policies have put a wide range of countries under considerable stress, and second-string state authorities maintain themselves through inter-ethnic tension (i.e., blame ethnicity X, not us).

And Britain, having colonized half the world, is no stranger to divide and conquer, which one would usually classify among psychological strategies. One might say things are coming apart now not because of those Eastern Psiwarriors, but because the psychological tactics of the center and periphery can no longer hold back the problems of material degradation.

An interesting thing about FB is that the messages between users and the content shared there don't actually have to be sent by the users, and don't have to be displayed the way the users intended. Facebook could curate them or even manufacture them from thin air; the only reason this isn't happening is that FB is not acting maliciously (as far as I know).

If the US government wanted to use FB as a weapon, it could win a psychological war wherever FB is widely used, because posts lack any authenticity verification.

Apparently, FB has run successful experiments that altered users' moods through the way their timelines were curated [0]. So FB could conduct covert operations to alter the psychological state of a nation.

What's more, by combining location info and communications on WhatsApp and Messenger with its psychological profiling, FB could fabricate an entirely fictional "snapshot" of the world. For example, FB could display gaslighting posts as if they were posted by people's friends, while making sure the two never communicate directly (the post shown as coming from someone not contacted in a long time, who, when contacted on WhatsApp or Messenger, replies with AI-bot answers confirming the post).

I'm very sceptical that FB will remain a global social network without addressing these huge political risks. I think politicians will find a way to get rid of FB and set up a local social network that they can control, the same way it is in China.

[0]: https://www.theguardian.com/technology/2014/jun/29/facebook-...

What, specifically, is Facebook doing wrong today? How could they improve?

> What, specifically, is Facebook doing wrong today?

They rank content by the degree to which it captures users’ attention and prompts sharing. This ranking incentive is fundamentally driven by their advertising business model, where attention from eyeballs and shares generates revenue.

Things that make us angry get shared more; numerous studies show this. An ad-driven social media service is thus fundamentally driven to show us content that pisses us off.
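To make the incentive concrete, here's a minimal sketch of the difference (all field names and weights are hypothetical, not Facebook's actual model): an engagement-ranked feed next to a plain chronological one.

```python
# Hypothetical sketch: engagement-ranked feed vs. chronological feed.
# Field names and weights are illustrative only.

def engagement_score(post):
    # Shares and comments are weighted heavily because they spread content;
    # anger-inducing posts tend to maximize exactly these signals.
    return 3 * post["shares"] + 2 * post["comments"] + post["likes"]

def ranked_feed(posts):
    # Sort by predicted engagement, highest first.
    return sorted(posts, key=engagement_score, reverse=True)

def chronological_feed(posts):
    # Dumb list of posts, newest first.
    return sorted(posts, key=lambda p: p["created_at"], reverse=True)

posts = [
    {"id": 1, "likes": 50, "comments": 5,  "shares": 1,  "created_at": 300},
    {"id": 2, "likes": 10, "comments": 40, "shares": 30, "created_at": 100},  # outrage bait
    {"id": 3, "likes": 20, "comments": 2,  "shares": 0,  "created_at": 200},
]

print([p["id"] for p in ranked_feed(posts)])         # outrage bait floats to the top
print([p["id"] for p in chronological_feed(posts)])  # newest first
```

Under the engagement weighting, the outrage-bait post jumps to the top of the feed even though it's the oldest; the chronological feed just shows the newest first.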

> How could they improve?

By facing consequences for the violence they cause and profit from.

> By facing consequences for the violence they cause and profit from.

That's a way Facebook could be incentivized to improve, but the question was "how could they improve?".

I imagine that in a lot of cases Facebook could correctly identify the sort of posts that make people angry enough to lead to physical violence, if it put anywhere near the level of effort into researching that problem that it puts into maximizing its share of users' attention. Such posts could then be given a low rank in Facebook's algorithm.
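A hedged sketch of what that downranking could look like (everything here is hypothetical; the keyword list is a crude stand-in for whatever real classifier such research would produce):

```python
# Hypothetical sketch: demote posts a classifier flags as likely to incite
# violence. The "classifier" here is a keyword check purely for illustration.

INCITEMENT_TERMS = {"attack", "burn", "destroy"}  # stand-in for a real model

def incitement_risk(post_text):
    # Fraction of incitement terms present in the post: 0.0 (none) to 1.0 (all).
    words = set(post_text.lower().split())
    return len(words & INCITEMENT_TERMS) / len(INCITEMENT_TERMS)

def adjusted_rank(engagement, post_text, penalty=0.9):
    # High-risk posts keep only a small fraction of their engagement-driven rank.
    return engagement * (1 - penalty * incitement_risk(post_text))

print(adjusted_rank(100, "cute cat photos"))                # unchanged
print(adjusted_rank(100, "burn their shops attack them"))   # heavily demoted
```

The point isn't the toy keyword check; it's that once a risk signal exists, wiring it into the ranking as a penalty is the easy part.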

Do you think that they should make no attempt to rank content at all in the news feed? That is, the feed becomes just a dumb list of posts from friends ordered by creation date, descending?

Would that satisfy your concern?

> What, specifically, is Facebook doing wrong today?

They make it easy to do bad things to a lot of people at once.

They make it easy to communicate to a lot of people at once. Whether those communications are good or bad is in the eye of the beholder.

Expecting our communication channels to sort out the good from the bad is how we get to Chinese levels of censorship.

It's also why you don't make it that easy. If there were a device that could beam Facebook directly into a person's head, would you agree there should be some access control on that?

It's not just that they have a lot of influence and a lot of people on their platform, it's that they've shown zero interest in responsibility, especially if showing responsibility would reduce usage or advertising dollars.

No ads = no potential to manipulate people simply with enough money. I don't care for the argument that oversight could prevent this; there will be ways around it.

Ads provide some of the capability, but by no means all of it.

The engagement objectives of the ad seller (Facebook, here) mean that highly divisive or inflammatory content, which generates pageviews or increases time-on-site, may be promoted over more socially beneficial material.

I think "no potential" is way over-selling the case. For these purposes, I think targeted astroturfing is probably more cost-effective than ads, and wouldn't be prevented by Facebook dropping ads altogether.

I enjoy the thought of an alternate universe where the most common social network is a p2p federated open-source system without ads, just running on alternative firmware in people's WiFi routers. However, that universe would still have astroturfing, and it would likely be harder to detect.

I don't understand why you think ads have anything to do with it. Much of the content on Facebook is user-generated, or at least posted by users from a third-party source. Removing ads doesn't change that.

Ads are just an easy and cheap method of adding fuel to the fire.

I also wasn't trying to argue that removing ads exclusively solves the problems.

In that case, "no potential" was poor phrasing. Ads aren't the only mechanism of social influence for sale on the open market to anyone with enough money.

Yes, you're right. I need to stop commenting for now, I'm not in a good place - not analyzing as deeply as I normally would or like to. Commenting is a distraction from the chronic pain I have, though it also at times will break my thought processes. It's a tough place to be. Some days are better than others.

I'm sorry to hear of your pain and hope you find some relief.

What if they do it to themselves? Proscriptive societies don't value absolute freedom and setting them up with democratic institutions leads to the most odious demagogues coming to power.

> The eastern countries (Russia, China, etc.) have long been more deft at political and psychological offensives.

The existence and prevalence of Hollywood begs to differ. It is probably the most effective propaganda machine in the history of the world.

It might not be an overt operation but US-made movies are certainly supported by the government and are given special access to military equipment if the story portrays them in the right light.

I don't think "killing millions" is the point, nor a metric worth measuring. You want to destabilize your enemy, and civil war is the ultimate form of that. On that spectrum, I believe, is also simply polarizing nations: making a red and a blue and setting them against each other. I think the U.S. is a fascinating example of that, and it seems that Russia has done a pretty competent job fanning those flames.

You are assuming Facebook wasn't designed to do exactly what it's doing now: operating as data gathering for the panopticon and precise propaganda via the social graph.

Who is to say that Facebook isn’t part of the United States broad scope cyberwarfare effort?

Given what the government needed to do in the early 2000s, and the runaway success in preventing large scale terror attacks, it seems unlikely that companies like Facebook would not be part of that.

> runaway success in preventing large scale terror attacks

Bit of a tiger-repelling rock argument, this. How can we count how many incidents didn't happen?

Or maybe the terrorist problem is extremely bad, but only within the Middle East? There has certainly been a problem of British jihadis joining factions in the Syria war.

Let’s say they are; who do they answer to? Hillary and her team think they contributed to her losing the election, and FB, at least publicly, seems to favor leftist/progressive ideas, so it’s not clear to me who they answer to, unless you’re suggesting some weird “deep state” conspiracy track.

Who does any defense contractor answer to?

Look at Tor. It was created for some military purpose. Who controls it? Nobody really, it’s out of control.

It’s not about some conspiracy, it’s a power tool without adequate safety measures, in the control of people who don’t understand or care to understand the strategic impact on society.

> The eastern countries (Russia, China, etc.) have long been more deft at political and psychological offensives.

Russia knows perfectly well that Western psyops cost them their empire, the old USSR. For them, this is a grudge match.

Russia might believe that it was the evil Western psy-ops, but it doesn't make it so. After all, they believe in the Dulles' Plan, too.

> but it doesn't make it so

That's neither here nor there, because they believe it and are acting according to that belief. All those decades the West was beaming its propaganda into their territory by the media of the time, now it's payback time.

There's a certain difference between acting upon rational and irrational beliefs.

And all those decades, while Western broadcasts were being furiously jammed by the USSR, Soviets were broadcasting just as much on their own propaganda channels. Which weren't being jammed either.

Zuckerberg is a smart guy and seems to care about policy. I wonder why he's so oblivious to the amount of harm he's causing.

He's a smart guy when it comes to technology, but with respect to the psychological and emotional side of human behavior related to habits and persuasion, he may have a mental blind spot.

I think a very inconvenient and ugly truth is slowly dawning on Facebook and society in general: connecting people en masse is bad. Facebook may go down in history as the next Big Tobacco or Big Fast Food, which touted new innovation as mass utopian relief only to be later debunked as charlatan science.

Zuckerberg has always touted Facebook's core mission as "connecting people throughout the world." The mission is so fervent you get socially deranged executives like Boz who boast about "connecting people at all cost... even if someone gets killed or bullied" [paraphrased here]

Technologies change throughout the ages, but human societies in general don't. Like it or not, part of what makes a society work is that the people you interact with have some similarity with you culturally and that interactions have costs - in the form of reputation.

If you lie everyday, you'll be known as the liar. If you are prone to anger attacks, people don't want to be near you. If you set a fire in your village, you get condemned or thrown in jail. The threat of social stigmatization helps regulate bad behaviors.

Facebook completely strips away all of that. It forces human beings to interact in an unnatural and inhumane way, in which almost anyone in the world is allowed to attack your conscience and upset you. Your immediate circle is supposed to be small and protective, whereas Facebook opens you up to attack from all sides, putting you at the whim of whatever the public is feeling. You're assaulted with ideas and emotions you cannot possibly process fast enough as a human being.

It's situations like these where a fair amount of liberal arts training can help engineers like Zuckerberg understand that human beings are not statistical models. Your data set may suggest that the more "connections" people make, the more satisfied they feel. Just like mice who find out that pressing the lever and getting an injection of chemical feels very very good. It completely ignores what it means fundamentally to form a coherent society.

> I think a very inconvenient and ugly truth is slowly dawning on Facebook and society in general: connecting people en masse is bad. Facebook may go down in history as the next Big Tobacco or Big Fast Food, which touted new innovation as mass utopian relief only to be later debunked as charlatan science.

Wow, it is amazing to see how many people have come to this incredibly pessimistic view (I say this based on other HN posts as well as this one).

It's been a long time since people lived in villages where each person's propensity for lying or not could be easily discerned. And it indeed took a while for people to reach the point that they could relate to strangers without either attacking them or allowing them to take advantage of them.

This progression has been necessary for the civilization that we know (for all its goods and ills).

It's worth noting social media arose in the US at the point when a lot of immediate associations were decaying (see Robert Putnam's Bowling Alone). Social media has fulfilled a lot of the functions of things like the bowling leagues whose decline Putnam documents. Certainly social media has been problematic, but it is essentially where the socializing part of society is going (not to mention that a lot of the problems the article and the parent attribute to social media in particular are actually problems of society in general). If anything, among social networks, Facebook has done the most work in creating an interface to your friends in particular rather than the world at large (does the parent know he can keep non-friends from seeing or commenting on his feed? It might solve some of the problems he rails about).

The thing is, I suspect that today's Facebook haters don't share an urge to return to the 100-person band societies that the primitivists I once knew idolized. Instead, I think they imagine a society where socializing will simply stop in general and people will give up the idea of friends entirely, or that some small bubble they're in can be an exception to this.

> Wow, it is amazing to see how many people have come to this incredibly pessimistic view.

It's amazing to see how many people ostensibly a part of "hacker culture" come to this view.

What's gone wrong here is not that "connecting people en masse is bad," but that those connections were supposed to subvert and tear down the systems of centralized authority and information control set up by governments and the media... but we've just replaced one set of suits with another, and now we pretend the internet only has a few channels the way television used to.

I'm all for ending the quasi-monopoly that Facebook and other big sites have on the public's awareness, but please let's stop implying that the problem with the web is its power to connect people.

> those connections were supposed to subvert and tear down the systems of centralized authority and information control set up by governments and the media

...even when those systems would help. The problem isn't just that we've "replaced one set of suits with another", but that this new set doesn't care enough to actually use their power. Throughout this Sri Lanka saga, for example, Facebook didn't have either an office in the country or any Sinhalese-speaking moderators, and didn't respond to the government's appeals until, a week or two into the violence, they blocked Facebook and suddenly the company realized its precious market share was in danger.

A black guy and a white guy who have to live with each other may realize they have more in common than not, but if you connect a bunch of people over the internet, the like-minded coalesce and create feedback loops that further isolate them from outside groups.

> I think a very inconvenient and ugly truth is slowly dawning on Facebook and society in general: connecting people en masse is bad.

For those who aren't aware, this insight came from a Facebook VP: https://www.buzzfeed.com/ryanmac/growth-at-any-cost-top-face...

> connecting people en masse is bad

I would have thought that the problem is elites lying, being extraordinarily selfish, and working for causes that don't like being exposed to the public, not infrequently for the benefit of only a small part of the population, so that people don't trust the more official channels of information in the first place.

I’m not sure your message was clear enough. You’re addressing the example of CNN twisting reality so much to serve Democrats that most of the population now distrusts the media and trusts peer-to-peer information networks, I believe. With the compound problem that the comment above yours blames it all on the people. Yes, people want a narration of reality that matches their direct observations. Like, how can a group be portrayed in the media as uniformly good if they repeatedly carry out bombings and looting, and say they want to carry out more bombings? Facebook comes to the rescue because the media doesn’t inform people about real things. Yep, cognitive dissonance.

> Like, how can a group be portrayed in the media as uniformly good if they repeatedly carry out bombings and looting, and say they want to carry out more bombings

What group?

I don't think the idea of connecting people en masse is bad; large communities can and have worked out fine.

What I think is the issue is how Facebook and similar sites do things. Why? Because unlike real life:

1. You only have one profile on said sites (in most cases), and that means people can easily connect your different interests together.

2. Groups who can't tolerate each other's existence are forced in a situation where they've got to interact on a regular basis, or where interactions are made fairly common.

3. And everything you do is public record forever.

Those in turn are the issues that are causing all this social instability. Why? Well, think of it like this.

On the first point: people always present a different 'face' based on the group they're with and the situation they're in. Their family knows one side of them, their love interests know another side, their friends another, their colleagues yet another, and various authorities know different ones entirely.

But social media services remove this. They give people one profile for everything, and expose groups who didn't want/need/have to know about some aspect of an individual to things they weren't really meant to know about. In the old days, an 'offensive' joke might reach 5 people who'd groan a bit and then forget about it. Now it reaches half the internet, leads to jobs being lost and causes flare ups between friends and families.

They also force together people who can't be 'openly' tolerant of each other, and make their existence hard for their enemies to ignore. Yes, lots of people in real life have 'abhorrent' views, but said views don't travel beyond the people who agree with them. Social media sites push said views in everyone's faces.

What Facebook and co have caused isn't necessarily polarisation or what not. They haven't made views 'more' extreme.

They've revealed the tensions underneath human society that were previously able to be ignored because people could get along and work together in ignorant bliss. They've made every 'bad' and 'offensive' viewpoint held by some group or another visible to those outside the groups they're intended for.

Connecting people isn't the problem. Making visible things that shouldn't be visible to everyone and making every group and viewpoint too visible is.

> They also force together people who can't be 'openly' tolerant of each other, and make their existence hard for their enemies to ignore. Yes, lots of people in real life have 'abhorrent' views, but said views don't travel beyond the people who agree with them. Social media sites push said views in everyone's faces.

Actually, if you read through this story, that is the exact opposite of the truth. Facebook connected only people who held the same abhorrent views - aka Sinhala-speaking people primed to believe conspiracy theories about the Tamil Muslim minority. They were telling stories about groups they couldn't tolerate, not actually interacting with them.

As tempting as it is to believe this story's premise that Facebook is bullying?/neglecting? smaller countries and then fit that into the larger backlash against social media, I think it's very misleading to believe that actually applies in Sri Lanka's case.

In fact, Sri Lanka has a history of majoritarian ethnic violence, up to and including genocide, against ethnic nationalities and minorities within the country.

Sinhalese also committed organized violence against Muslims in 1915. There were riots in 1958 because Tamils protested the use of Sinhalese on license plates (following 1956, when Sinhalese replaced English as the only official language of govt, disenfranchising all Tamil govt employees). Tamils were killed in 1974 at the international Tamil conference in Jaffna, the Tamil cultural capital. And of course, the official war between 1983 and 2009 was one drawn-out genocide against Tamils by the 99% Sinhalese military and 95% Sinhalese police force, "ignited" by Sinhalese "mobs" using govt voter lists to burn Tamil homes and businesses along with killing Tamils, while the police were deployed everywhere but stood by and watched.

I get the tech relevance of Facebook here, but this story is trying to take the latest ethnic atrocity from a country that's already systemically racist and somehow shoehorn it into the larger narrative of social media, corruption, and politics.

Maybe this is Sri Lanka's attempt at distancing itself from the Cambridge Analytica exposé that Channel 4 did undercover, where they posed as a middleman working to swing the recent SL local elections in the favor of an opposition chauvinist strongman who oversaw the crescendo of genocide against Tamils in 2009 (https://www.channel4.com/news/exposed-undercover-secrets-of-...)

I find the Sri Lankan government a very unsympathetic, persistent, complicit actor in the violence within its borders. It's not like Facebook and Cambridge Analytica were around to cause all the racist violence since 1915.

It's quotes like this that deflect responsibility:

> “We don’t completely blame Facebook,” said Harindra Dissanayake, a presidential adviser in Sri Lanka. “The germs are ours, but Facebook is the wind, you know?”

And this, too, which says lynching is a natural individual reaction to systemic failure of law and order:

> And where people do not feel they can rely on the police or courts to keep them safe, research shows, panic over a perceived threat can lead some to take matters into their own hands — to lynch.

There is a new, democratic government in the country, which seems to have worked very hard to tamp down the tensions (even when that meant working against majority views).

Calling yourself a democracy doesn't excuse persistent, depraved, inhumane behavior like this: http://tamilguardian.com/content/sri-lankan-officers-involve...

People have always hated each other, and unsubstantiated rumors/gossip/fake news have always been around too. Something about the volume of information and the speed at which it propagates makes things like Twitter and Facebook extra dangerous, but it's happening on WhatsApp and surely text/email too. It just happens faster digitally, and the pub/sub model seems to be especially bad for inflaming tensions.

I don't think we should blame the medium through which people share these things, but I hope we get to understand the dynamics and psychology of why digital pub/sub makes things so much worse than IRL 1-to-1 communication.

First, moderation is an after-the-fact reaction. That is insufficient. Facebook needs to take more responsibility for educating people about the pitfalls of the internet, as well as informing them of best practices on social media, in the countries it is essentially invading.

Second, I'm appalled that these moderation positions remain unfilled on the excuse that Facebook is trying to fill them in a physical location where few people speak the language in question. Facebook needs to immediately open such positions to remote workers with the requisite language skills, and to develop a program for remote moderation so that language-specific roles can be filled wherever native speakers can be found.

Third, the rest of the world needs to figure out how to fight fire with fire. The rest of the world needs to start aggressively developing best practices for using Facebook as the solution to the problem it is creating in such places.

I have no idea what that would look like. But if Facebook is the way information is spreading, then Facebook is one of the ways antidotes to these deadly viral events should also be spread.

"In Sri Lanka, it keeps families in touch even as many work abroad."

"It provides for unprecedented open expression and access to information."

"But where institutions are weak or undeveloped, Facebook's newsfeed can inadvertently amplify dangerous tendencies. Designed to maximize user time on site, it promotes whatever wins the most attention. Posts that tap into negative, primal emotions like anger or fear, studies have found, produce the highest engagement, and so proliferate."

What if there were applications or websites that allowed families to keep in touch even as many work abroad. [There are.]

What if there was a computer network that provided access to information. [There is.]

What if the computer network provided for open expression. [It does.]

What if the problem is none of the above.

What if the problem is the "Facebook newsfeed", which serves no justifiable purpose other than "to maximize user time on [a single web]site", where family member behaviour is recorded and family members are easily targeted by advertisers who pay the owner of the website for this "service".

Who conducted the studies re: primal emotions.

Has Facebook conducted such studies.

Do families need a "Facebook newsfeed" to keep in touch even as many work abroad?

Does Facebook need a "newsfeed" and other tactics to keep family members on Mr. Zuckerberg's website longer than they need to be in order to keep in touch, so they will view/click more ads and generate more behavioural data.

Are the needs of families and the needs of Facebook "aligned".

> Has Facebook conducted such studies.

I strongly doubt it. I suspect they have a system that measures "engagement" with individual posts and promotes those that score highly, and it just so happens that those tend to be hateful or tribal ones.

I guess I just assumed that the Arab Spring back in 2010 was the US using social media. But it's not really very useful to chat conspiracy any more, although that's not stopping the NYTimes.

Also funny that they are singling out Facebook as the match instead of the other social media players. You can likely wreak havoc with Twitter and Google ads as well. Lots of ways a bad actor can misinform.

> Also funny that they are singling out Facebook as the match instead of the other social media players. You can likely wreak havoc with Twitter and Google ads as well.

Those services don't have the kind of reach in Sri Lanka that Facebook does.

Google has 98% of search in Sri Lanka (http://gs.statcounter.com/search-engine-market-share/all/sri...). But Facebook has massive usage over Twitter.

No one was buying ads here - they were just sharing content for free, and amplifying each other.

At this point, what's clear to me is that it is criminally irresponsible for Facebook to continue operating in countries where they have no local presence, no active communications with local authorities, and no ability to effectively moderate content. Their passivity is literally killing people.

The difficulty is that in many cases it's the local authorities who are the ones spreading the inflammatory content and their requests for moderation are actually requests for censorship.

And if that's the case, the most responsible thing Facebook can do is to get out.

By that logic, shouldn’t any company whose services could be politically consequential get out? Then doesn’t that deny the population the tools they could use to influence their country?

They should stay in the market only if they're willing to put in the effort to make sure their political effects are neutral or positive. Facebook clearly isn't willing to do that.

First, do no harm.

But is it only Facebook, then? Should Twitter, LinkedIn, Google Plus, etc. all pull out, since they can be used for the same purpose?

They can be, but in practice Facebook's UX, market position, and algorithms make its political effects much stronger.

That may be the case for Burma, but in this case the government was trying to halt said inflammatory content.

Facebook policy, however, doesn't care about the differences; it just wants to do whatever requires the least effort while maintaining high market share.

But the internet should be free for all; how you use it is up to you.

Shutting down communication channels will not stop violence.

That first sentence is an absolute moral position without any real justification, and the second is factually questionable.

Shutting down certain communications channels, or removing specific messages from them, can absolutely change behavior. Just like social media can drive a revolution, it can drive any collective action, including negative ones.

True. So do we start regulating 4chan, Tor, and other means of communication that can lead to negative effects? Twitch, because you could get swatted? I mean, all channels will usually have positives and negatives, and it always comes down to who will be the arbiter of what is positive or negative. I am afraid of those people, because I used to live under communism, and they are the weak link in resisting censorship...

Also in the Philippines: "How Duterte Turned Facebook Into a Weapon with a Little Help from Facebook"


The issue here is propaganda is no longer controlled by state-run television and radio.

Whether we like it or not, the mobile Internet inflection point has happened and there's no turning back.

Anyone with basic literacy skills and the economic means to distribute bad information can spread propaganda and incite others on millions of mobile devices instantly.

And if it wasn't distributed on Facebook, the misinformation would most certainly find its way through other channels (Twitter, Viber, WhatsApp).

Because Facebook doesn't care enough about certain markets to actually moderate, it means propaganda is no longer controlled by anyone. This is, in practice, extremely dangerous.

This is reminding me a scary amount of the role of radio in the Rwandan genocide.

Absolutely. And it is weaponized to that end in numerous jurisdictions. Turns out that all the work done on sales, growth hacking, virality and so on deployed in the service of commerce is even more effective in the political domain which is completely emotion-driven.

A fascinating example of an ongoing campaign is the "Qanon" conspiracy theory, which has been running since late last year. The mysterious 'Q', who supposedly holds a top-level security clearance within the executive branch of the US government, issues (via the 8chan imageboard) frequent brief and cryptic clues, exhortations, and warnings about the 'gathering storm', postulating that most political news is misleading and that the administration is secretly investigating a vast conspiracy of pedophilia that reaches to The Highest Levels of society. If you remember the 'Comet ping-pong' conspiracy theory, where a guy went to a DC pizzeria and fired shots into a ceiling attempting to free abused children imprisoned in the basement (the building didn't have a basement), think that but on a national scale and with all sorts of other, darker implications. There's a large and extremely active community devoted to interpretation of these clues and crowdsourced information gathering on a grand scale.

The interesting thing about Qanon isn't Qanon itself. The account just spews out vague randomness continuously. It could just be a Markov chain generator connected to the text of news sites.
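For what it's worth, the "Markov chain generator connected to the text of news sites" mentioned above is a genuinely trivial thing to build — a word-level sketch (names and structure are my own, purely illustrative) fits in a few lines:

```python
import random
from collections import defaultdict

def build_chain(text):
    """Map each word to the list of words that follow it in the corpus."""
    words = text.split()
    chain = defaultdict(list)
    for cur, nxt in zip(words, words[1:]):
        chain[cur].append(nxt)
    return chain

def generate(chain, start, length=10, seed=None):
    """Walk the chain from `start`, picking a random successor each step."""
    rng = random.Random(seed)
    out = [start]
    for _ in range(length - 1):
        successors = chain.get(out[-1])
        if not successors:
            break  # dead end: no word ever followed this one
        out.append(rng.choice(successors))
    return " ".join(out)
```

Feed it a pile of headlines and it will emit locally-plausible, globally-meaningless text — which is exactly the property that makes pattern-hunting in the output so seductive.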

It's the first derivative of the feed that's scary. There's a whole bunch of "cloud-spotters" convinced that they are seeing patterns in the madness and recompiling bits of it into a conspiracy theory narrative. If all the downstream audience sees is this derivative output, you end up with a collection of "Q is real!" believers.

It's just a fascinating emergent phenomenon, if not for its very real political consequences.

I don't know whether it's random or not. It could be a Markov chain generator, and much of the 'content' is just recycling obvious contemporary events to give it a veneer of veracity, but it does seem to drop some genuine leaks from time to time. A journalist friend of mine monitors the feed and the surrounding community and I'm sure glad that I don't have to keep up with it myself. I check in on it once every week or 10 days and frequently wish I hadn't.


"The likelihood is very high that this is occurring."

...uhhhh, where are you getting that? I mean you've made absolutely massive and bizarre claims here and made no attempt to substantiate them. The likelihood is high that this whole extremely specific and strange scenario is true and that none of us have any idea about it??? How do you figure?

The power of media is hugely underappreciated.

Marshall McLuhan, Noam Chomsky, Elizabeth Eisenstein, Walter Lippmann, and others have been writing and warning of this for over a century.

...I want to argue that the parallels between the printing press era and today are sufficiently compelling to suggest:

Changes in the information age will be as dramatic as those in the Middle Ages in Europe. The printing press has been implicated in the Reformation, the Renaissance and the Scientific Revolution, all of which had profound effects on their eras; similarly profound changes may already be underway in the information age.

The future of the information age will be dominated by unintended consequences. The Protestant Reformation and the shift from an earth-centered to a sun-centered universe were unintended consequences in the printing press era. We are already seeing unintended consequences in the information age that are dominating intended ones and there are good reasons to expect more in the future. Thus, the technologists are unlikely to be accurate and the inventors may neither have their intended effects nor be the most important determinants of information age progress.

It will be decades before we see the full effects of the information age. The important effects of the printing press era were not seen clearly for more than 100 years. While things happen more quickly these days, it could be decades before the winners and losers of the information age are apparent. Even today, significant (and permanent) cultural change does not happen quickly....

Renato Rosaldo, The Cultural Impact of the Printed Word, A Review Article, Comparative Studies in Society and History, 1981, Vol. 23, No. 3. p. 508.


It’s also something to consider from a business perspective. Every first world country locked down broadcast media with licensing. I suspect the same thing is coming for the internet.

And the role of the movable-type printing press in the religious violence that raged across Europe in the 1600s.

Move fast and break civilizations.

I cannot read the whole article due to the paywall, but from what I infer, the article is trying to say that Facebook is not doing enough to suppress incendiary content. But isn't that asking Facebook to be a censor and arbiter of the suitability of content? It may well seem to people and the mainstream media in Western democracies that Facebook is acting irresponsibly, but out here in the third world, where our freedoms and liberties are not so strongly enshrined, Facebook and its ilk are valuable channels to speak out against oppression by governments. I fear that whatever measures Western people and governments force upon Facebook etc. on grounds of greater responsibility for content will effectively grant more power to repressive governments to censor legitimate dissent.

Moreover, the Western MSM seems to think that Facebook and other social networks should bear greater responsibility since people seem to be getting more and more of their news through social networks. In that regard, perhaps it is the media houses, trying to protect their vested interests, who want to perpetuate their role as gatekeepers of information. I think the time for gatekeepers is past; they should be evolving towards a model where their focus is not news gathering and reporting, but rather establishing/publishing trust measures for news items circulating on social networks and the internet.

In this analogy would the internet be oxygen?

There's actually a great analogy in the article:

> “We don’t completely blame Facebook,” said Harindra Dissanayake, a presidential adviser in Sri Lanka. “The germs are ours, but Facebook is the wind, you know?”

It's more a vector carrier. A set of fuses leading to powderkegs.
