> Neal Stephenson: I ended up having a pretty dark view of it, as you can kind of tell from the book. I saw someone recently describe social media in its current state as a doomsday machine, and I think that's not far off. We've turned over our perception of what's real to algorithmically driven systems that are designed not to have humans in the loop, because if humans are in the loop they're not scalable and if they're not scalable they can't make tons and tons of money.
How is this any different from turning our perception of what's real over to mechanical turk systems like newspapers and TV news?
This line about perception of reality somehow being different today than it was, say, in the 1980s keeps coming up. The main shift is that the number of (dis)information sources has risen sharply and they've become much more efficient. Secondarily, entire media channels (newspapers and TV) have lost relevance to all but the older generation. Increasingly, those news outlets just track social media, which may be the oddest development of all.
In this context, the effective discontinuation of the White House press conference was inevitable. For better or worse, the president talks directly, unfiltered, to everyone. That's the real doomsday machine if put into the wrong hands.
Accountability. Someone is ultimately responsible for every newspaper article and every TV program. If a national newspaper decided to publish a piece of paid foreign propaganda, someone would have a lot of explaining to do, possibly in a court of law.
There's no such accountability for algorithmically-driven systems. It's not our fault that our platform keeps promoting propaganda, it's just a quirk of the algorithm; we don't really know why our algorithm is doing this and we're not going to let you review the source code.
I don't know what the possible solutions might be, but there's clearly a problem. Social media companies have largely abnegated responsibility for what goes on their platform and their platforms can't function without that abnegation of responsibility.
Because size matters, and in this case social media is a lot more pervasive (I'd say at least an order of magnitude more) compared to TV news of the 1980s or to newspapers. I knew lots and lots of people who didn't read newspapers 30 years ago, and the majority of the people around me didn't give much thought to what was presented as "news" on TV because we knew it was mostly lies anyway (I grew up in an Eastern-European country located on the wrong side of the Wall), but nowadays even the remotest shepherd has a cheap Android phone with an Internet connection which he mostly uses for Facebook and WhatsApp.
In other words social media is a quantitative change (billions of people are using it) which has transformed into a qualitative one (because billions of people are using social media things are different this time).
I think fundamentally the issue is that too much personal data is being collected, and put together, making profiles on people that can be used and abused. Social media is the first major area where that data is being employed and often being deployed irresponsibly (although I question the idea that so much personal information could ever be deployed responsibly).
Heck, even without social media it seems like there was a progression in politics where things were becoming unhinged. Things were becoming more and more polarized, and what is/was considered progress moved further and further from many people's ideas of progress. So there is a series of issues coalescing in our time, against a background of the century of the self, identity politics, etc.
It's definitely why politics now is incredibly toxic, and I don't really understand why anyone contributes to or even cares about political Twitter dunking; it's definitely not healthy.
A community in chaos, full of micro-tribes following their own lore, used to be cause for war. Now we have it in extremely quick replies, at all times, anywhere.
In the sense that having some hot oil spill on your hands while frying something is not the same as being submerged in burning oil as a medieval punishment.
Network and scale effects affect the impact of technological innovations. Even quantitative increase, over time, leads to different qualitative impact. But here we also have qualitative differences (e.g. TV networks didn't know shit about you, social networks know everything about your preferences, searches, friends, and so on).
Interestingly, that shift seems to have happened everywhere, not just in politics. Many companies have also realised that they don't need to go through the media/journalists any more, and started advertising directly to their customers/users/fans too.
Wonder if there's a term for that sort of social shift?
This is like asking why be scared of atom bombs when we already had swords.
> So that means that access to that kind of higher-quality view of the world becomes a class-based situation where people who've got the money to pay for or partially pay for human editors and curators are getting higher-quality info, which I think is just a slight kind of magnification or intensification of the way things are now anyway.
> …intensification of the way things are now anyway.
Now, here I disagree with Mr. Stephenson; these things were much more intense in prior times.
What he observes seems to be the development in the U.S. in terms of tonality of political discussions.
If you look at Asian countries as a counterexample, there is no such "intensification" happening because of social networks at all. To the contrary, even the house of suckage, FB, is contributing positively, because a lot of people would not go online without FB or other easy ways in. The operating keyword here is easy. For many people who have only just become literate, or who are the first in their family to be so, many social networks are a somewhat more comprehensible access point to information than "Google Fu".
That's hardly an argument in favour of social media. The US is a democracy where people used to have civic debates around a shared reality.
That social media isn't going to ruin the debate climate in corporate-governmental societies where civilians don't have any input anyway doesn't really calm my nerves. In fact it frightens me. If the future of the internet is a cluster of entertainment bubbles with business and government bending reality to their will, and China is an example of the future, I'm not really keen on duplicating it. It's actually a very prescient observation: There seems to be a strong correlation between societies with high technological development but no democratic tradition, and the demise of traditional channels of media in favour of social media.
If anything, social media is worse in Asian countries, being used for massive scams, for genocides, for vigilante justice (which often targets completely innocent people), massive amounts of fake news, etc.
'We've turned over our perception of what's real to algorithmically driven systems ....' Like Nielsen ratings?
We certainly weren't allowed to play a greater part then. Are we now, really? So, if once again we've allowed corporations to spoon-feed us, in return for watching advertising ... how's that any more of a doomsday than the first time around?
Stephenson's a remarkable story-teller but this comment is not helpful. If things look darker, that doesn't mean that they are. TV and Newspaper headlines were essentially algorithmically predictable. 'Darker' stuff wound up in magazines.
Today, anyone with the time can see it all ... unfiltered by news 'experts'. If it looks manipulated, it always was.
"The business of the journalists is to destroy the truth, to lie outright, to pervert, to vilify, to fawn at the feet of mammon, and to sell his country and his race for his daily bread." - John Swinton, 1880
Who are you to tell me my beliefs are wrong? I can be anything I want, I am qualified to be an expert on something because I want to be. The Earth is flat, who are you to tell me otherwise?
The lack of humility is the real doomsday machine.
Even if it's a minuscule amount, coupled with anti-vaxxers, climate change deniers, etc., it leads to pain points in society; even having to prove them wrong takes a toll.
Groups whose sole purpose is to trash-talk something are loathsome, even if the target deserves it.
Slashdot had some commercial awareness (e.g., there was a lot of predictable "take my money!" meme about products), but wasn't thinking along the most profitable lines, and maybe never could've. (I've been wondering how much/often dotcom founders of successful startups reshape the world in their image, and I doubt that Malda saw the world with a Zuckerberg eye, for example.)
Another big property that was bought by the same people was SourceForge. Which didn't fetch anywhere near the price of GitHub.
And it's still a dumpster fire.
If you want intelligent discussion, you go elsewhere. HN is pretty good (though with numerous glaring exceptions). Any part of Reddit that's not ruthlessly moderated (and/or still highly obscure) fares poorly. I've got small groups on Mastodon and Diaspora which work fairly well.
But clue flees idiocy and mindless conflict hard. And if your system breeds either, intelligence won't loiter long.
I am reluctant to put it here, but maybe just a taste?
[flame suit on :D ]
I disagree. Honestly, one can make tons of money, and incorporate the humans too. That can all scale, though the margins are not as good. So what? There is still tons of money to be made, just not the absolute maximum tons of money to be made.
The difference between those two is profound when it comes to the potential impact of social media on society today.
A while back, some friends and I did an experiment. What we did was set the expectation that people need to treat other people well, assume good intent, and in general, don't be a dick.
Anything else is a risk.
Being a dick comes with risk. It may be small, it may be significant. It's hard to know, by design.
See, when we define how shitty people can be to one another, they end up being exactly that shitty, plus a little.
Give people targets and they will meet them. Give them rules, and they will game them.
So then, make the only clear target real, human discussion, with mutual respect and consideration, and as time passes the community will see more and more of that.
What does this have to do with the discussion we just started? Hang with me for a minute. I have to lay some foundation first.
Here is the money part:
Agency. When someone says something shitty to us, we are in control of how we respond. The number one response in most communities is righteous indignation. Number two is to return it with equally shitty discussion.
And from there, it's a meta mess. They said, you said, he said...
Everyone wants "the platform" to handle this for them, to protect them, to enforce "the rules", etc... Ask people this, and they will go on at length about it all.
You can also give them scenarios, and they will also tell you indirectly, "the rules are great, but I need an exception, because other people are shitty to me."
This behavior dynamic is chronic. Rule based systems won't fix it. Machine learning won't either.
Now, what happens when people understand this is on them?
Someone calls us an asshole. They are strangers; we don't know them, and they have established little to no credibility. Or they are trolls and are seen as such. Maybe they are clowns.
With agency in play, we have many options!
Score their insult (C+ Weak sauce, try harder)
On Quora, I will often apply this. Someone goes off, and I will say, "You should edit that away before someone reports it. Then we can try again."
Doing that is sometimes amazing. I've had some awesome exchanges with others that started out on a very bad basis.
We are in control of our end of the conversation. We manage whether it goes bad or not.
So much of the angst we see related to all this social media boils down to the fact that many people do not recognize their agency in dialog, and they expect the system to take care of them, relieve them of the burden of actually interacting with other people.
Now, where it comes to the real / unreal, there is a dynamic in play I find extremely disturbing, and this is the part I am reluctant to put here, so again, just a taste:
Our body politic is ill. Our mainstream media is not delivering a robust, real, inclusive dialog on policy. Tons of Americans have come to realize this, and it's a growing problem for the big money, establishment media.
A couple examples:
Where is the anti-war movement in the US, and what coverage does it get on mainstream media?
What percentage of news and opinion produced by mainstream media in the US do so from the labor, poverty point of view?
When is the last time any of us saw a reasonable debate on mainstream media about Net Neutrality, Copyright, and the basic problems and dynamics we are seeing where society, the law, and technology intersect?
There are many more. I just picked a few to make a point here.
All of those topics suffer from conflict of interest. The owners of mainstream, big media see those topics as threats to income, and coverage, editorial influence, all reflect that reality.
Now, ordinary people actually have an interest in those topics, and do discuss them. Where?
In addition, there are tons of great discussions going on: here, mailing lists, forums, Reddit, you name it. They are happening, and they bubble up to platforms like Twitter and FB, which actually host a ton of "secret groups" all focused on specific topics, causes, and ways to action and activism.
(one does wonder how secret those are, but that is a discussion for another time)
Right along with that we have less informed dialog, conspiracy theories, crazy people, angry people, etc... And we've got some level of nefarious activity too. Many think "the russians" are producing content that is actually impacting elections, for example. I am not going to state true or false on that, just recognize that the sentiment is out there.
Every single day our mainstream media delivers "Fake News." The easiest to see is war-related propaganda going out as news. It's constant.
Every single day, our mainstream media talks about all "those other people" on social media producing "Fake News" and what are we going to do about it?
See the trouble?
We can't resolve this "Fake News" on social media problem without also recognizing the mainstream media problem we have today as well. Without that real conversation, the solutions will be incomplete, untrustworthy, draconian, and the people overall will not see the outcome as just and the process legitimate.
That's rough, but real.
In addition, we've got many Americans looking for strong left economic policy reform. Often, that gets lumped in with "Fake News" which makes a mess of things in that draconian solutions are seen as chilling to otherwise reasonable, permissible dialog.
That's it, I will stop there, save for one last observation:
In a given community, one set up with the expectations I detailed above, or even some reasonable set of expectations that are clear and do not get in the way of most reasonable, real conversations, even pointed ones:
All it takes is a small percentage of regular contributors modeling the norms and the options to spread them throughout the membership, basically inoculating them against most basic meta and trolling bullshit.
The resulting conversation is healthy enough to filter truth from bullshit very reasonably.
Because people learn how to be wrong, how to handle attacks without exacerbating things, how to be secure in their conversation and seek enlightenment not always just validation and ego stroking. Spread that, and bullshit is displaced and or quickly marginalized.
(and often, bullshitters get sucked in, will own it, and begin to contribute too. Happens more than you think. And that happens, because the door for it to happen is held open, and appeals made. Come on, join us. We are all better for it...)
No machine learning needed. And scaling happens by natural community distribution. The very big platforms, Twitter for example, will be a mess because of their size. But what nobody realizes is that Twitter feeds tons of other communities. If those discussions are robust, the laughingstocks on Twitter will be known as such, and that bubbles right back to Twitter!
But there is a catch! It will do that for established big money interests same as it will nefarious politicians and bad actors trying to manipulate things.
So, what is worth what?
Do we really want robust, real, honest dialog, or is this really just about a cleaner, managed one like the status quo interests would prefer?
How do the people make the big money? That costs money to convey. Also no joke. Hard won information.
But, I have put enough here for others to go and do the work. If they do it, they can get the positive outcomes.
Honestly, we don't need to delete FB. The core of FB is a good thing. I'll leave my comment as it is for now, food for thought.
1. First, I've been observing various online trolling cultures and fringe reactionary political movements for a few years. Mostly 4chan, but I think it applies to the web in general. While I don't agree with their intentions or motivations, I find it an extremely interesting phenomenon and pretty post-modernist in itself.
* The cyclist behind an anti-cyclist Facebook group
* Recommended comments (read the link below before you continue):
Here's a story: a biking hobbyist used his spare time to launch an influential anti-bike movement on Facebook. He posted numerous sensationalized commentaries, posters, and memes to demonize bikers, including various "data" and "charts" for his "bikes are a great threat to traffic order" propaganda, and occasionally advocated violence towards bikers in the form of half-jokes. He also proposed political reforms to introduce restrictive and/or discriminatory traffic laws for bikers. Soon his campaign went viral and attracted a large number of supporters who drive cars. Not all bikers follow the traffic rules, and some problems surely exist, but they are only as bad as the average day-to-day traffic and usually not seen as something people need to act upon; this campaign exaggerated, radicalized, and polarized the issue and created a lot of hate. What was his intention, then? His only intention was laughing at this fact: he was able to manipulate the car drivers easily and even create an enraged populist political movement with a mere keyboard.
We hardly know who's behind the populist political movements online, but this is a well-documented case, and I believe it can be a representative sample which largely reflects the origin, emotions, and motivations of the 4chan-like "for-the-lulz" trolling culture. I can also immediately point out the similarities to populist reactionary political movements: nationalist propaganda has used many of the tactics the biking troll used. Also, Russia is accused by the media of creating an army of bots on social media (pro-LGBT, anti-LGBT, pro-life, pro-choice, religious fundamentalist, atheist, nationalist, liberal, etc.) with the intention of creating polarized arguments and social conflicts in the United States.
On the other hand, the "for-the-lulz" trolling culture itself, even if it can be used as propaganda, is not propaganda in itself. It is a form of complicated web culture. Its basic definition is "doing some pointless activity to make others suffer, and laughing at it", but it stands for a lot of things. It has a reactionary, or anti-establishment, element, as previously mentioned, but it also includes: (1) culture jamming, similar to the 70s counter-cultural movement (see those hoaxes in Fight Club); (2) the art of exploiting a vulnerability in the system for one's own advantage, similar to parts of the hacker culture; (3) a somewhat nihilist, post-modernist component, sometimes artistic (The Game was popular on early 4chan: once you remember playing it, you lose it); (4) a reflection of the worry, helplessness, and alienation of modern life, used as entertainment against them through humor and hate; and finally (5) a populist movement. Its core logic is: everyone, look how evil/degenerate this thing/person is, justice needs to be done! Sometimes it can be a legitimate popular movement and serve the right purpose, e.g. the Occupy movement, but other times it's just a bunch of angry mobs lynching people. For example, a list of people on Encyclopedia Dramatica.
In particular, once I grasped these points, I found things got pretty stupid when, for example, the mainstream media started reporting Pepe the Frog as a hate symbol in 2016. Yes, it has connections to reactionary political movements, but in itself it is only a symbol of the trolling culture. When the media reports on it, it actually gives it a platform of exposure and encourages trolls to use it as a genuinely political symbol, and personally I think that makes the issue worse.
What I'm trying to say is: to solve problems like this, we must study and understand the entire machinery of it, including the underlying socioeconomic basis of this conflict, using sociology and psychology, and provide social solutions to the problem. However, the only proposals the mainstream media (or we, as members of online communities) have right now are "Twitter and Facebook should ban hate speech" or "we should not feed the trolls", which hardly touch the deeper problems.
If you want to understand more about the trolling culture and its mechanism, I recommend "This Is Why We Can't Have Nice Things: Mapping the Relationship between Online Trolling and Mainstream Culture", by Whitney Phillips, published by The MIT Press.
Here are related comments on this issue I have posted before:
* Twitter has barred hundreds of accounts for posing as liberal activists
My comment: https://news.ycombinator.com/item?id=18239216
* RIP Culture War Thread
My comment: https://news.ycombinator.com/item?id=19231994
This one is mostly a repetition, but it includes one more book recommendation. I also recommend reading the original web article; it covers a lot of related discussion.
2. I was also trying to characterize online communities in terms of their medium, feedback mechanisms, and filtering mechanism. An online platform has a lot of content, but you only see a subset of it; the filtering mechanism determines which subset. We could say the first generation of online communities was mailing lists, Usenet newsgroups, and web forums. They only have a simple filtering mechanism: grouping content by topic, plus a kill file (a people/keyword blacklist, sometimes shared). This works relatively well and has the advantage of producing diverse topics and free-flowing discussion, but there's nothing to stop low-quality discussion, so a good signal-to-noise ratio must be carefully maintained. Another problem is the lack of feedback: when nobody replies to a post, it can mean any of five different things; see Warnock's dilemma (https://en.wikipedia.org/wiki/Warnock%27s_dilemma). Another important issue: the entire forum basically stops scaling when the number of users exceeds 10,000 or so. The information is too much for any individual to handle.
Platforms like Reddit or Hacker News can be seen as the second generation: they introduced voting as a feedback mechanism, which simultaneously (partially) solves the problems of content recommendation, restriction of low-quality content, and scaling. The "karma" system of Slashdot also counts, but it has an innovative "community moderation" scheme, so I guess it's generation 2.5. Anyway, overall, the usefulness of voting as feedback is great, and it has become an integral part of many communities, but not without its own problems. Voting creates a strong incentive for people to collect upvotes. People may stop voicing insightful contrarian views which would otherwise generate an intellectually interesting discussion in a newsgroup. Also, voting seems more suitable for link/media recommendations than for discussions: many subreddits have thousands of votes on links but only single-digit comment counts.
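The second-generation "voting as filtering" idea can be sketched with a toy time-decay ranking. The exact formulas these sites use are not public; the gravity-style formula below is a commonly cited approximation of Hacker News's front-page ranking, treated here purely as an assumption for illustration.

```python
# Toy sketch of vote-based ranking (assumed formula, not any site's real one):
#   score = (votes - 1) / (age_hours + 2) ** gravity
# Votes push a post up; age steadily sinks it, so the front page keeps turning over.
def rank_score(votes: int, age_hours: float, gravity: float = 1.8) -> float:
    """Higher votes raise a post; age decays its score."""
    return (votes - 1) / (age_hours + 2) ** gravity

posts = [
    ("fresh link, few votes",   5,   1.0),
    ("older link, many votes",  200, 24.0),
    ("stale link, many votes",  200, 96.0),
]
# Sort the way a front page would: highest current score first.
for title, votes, age in sorted(posts, key=lambda p: -rank_score(p[1], p[2])):
    print(f"{rank_score(votes, age):8.3f}  {title}")
```

Note how a fresh post with a handful of votes can sit near a day-old post with hundreds: that is the decay doing the scaling work the comment describes, at the cost of rewarding whatever gets upvoted fastest.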
Finally, around 2007, came social media. Social media is unique because it leveraged the underlying human social network as its filtering mechanism, and it has a somewhat "decentralized" model of publishing. Unlike a forum, where everyone posts to a single directory, on Twitter you post to your own account, others simply follow a number of people, and interesting posts circulate on their own. The system is highly effective; I love social media. Unfortunately: problems. First, the content has to be short; no long discussion is possible. This would usually not be a problem, but social media slowly killed most forums in the process and became the center of the web, forcing people to use it for discussion even though it is an awful platform for that purpose. Its structure is conversational: posts don't form a single thread but a dozen microthreads. A replies to B, but this reply is usually ignored by C, who may have crucial context. Furthermore, its volatile nature causes many problems: A mentioned X three hours ago, and now X is missed by most readers. Or A posted X and Y, but often only one of X/Y is seen by other readers, while the other one has important context. Also, social media has the strongest filter-bubble effect of all platforms, which may create echo chambers and increase political polarization, among other effects.
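The "follow graph as filter" point can be made concrete with a minimal sketch: a post spreads only along follow edges, and only travels further when someone who saw it chooses to reshare. The names and graph below are invented for illustration.

```python
# Minimal sketch of third-generation filtering: the social graph is the filter.
from collections import deque

follows = {                      # follower -> accounts they follow (invented)
    "bob":   {"alice"},
    "carol": {"bob"},
    "dave":  {"carol"},
    "erin":  {"alice"},
}

# Invert the map: account -> the set of its followers.
followers: dict[str, set[str]] = {}
for follower, followees in follows.items():
    for followee in followees:
        followers.setdefault(followee, set()).add(follower)

def reach(author: str, resharers: set[str]) -> set[str]:
    """Who eventually sees a post, given which viewers reshare it."""
    seen: set[str] = set()
    queue = deque([author])
    while queue:
        user = queue.popleft()
        for f in followers.get(user, ()):
            if f not in seen:
                seen.add(f)
                if f in resharers:   # a reshare pushes the post one hop further
                    queue.append(f)
    return seen

print(reach("alice", resharers={"bob"}))   # spreads past direct followers
print(reach("alice", resharers=set()))     # only direct followers see it
```

This captures both halves of the comment's observation: "interesting posts circulate by themselves" (each reshare extends the frontier), and the filter bubble (anyone outside the author's follow-graph neighborhood never sees the post at all).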
But think about it. Voting was popularized by Reddit around 2005; social media is a thing from 2007. Can we design a new generation of communities with real innovation? A better alternative filtering mechanism?
3. I've noticed a lot of online communities seem to go downhill. What is going on? I have a comment.
Back later when I can consider it properly. You dropped a lot. Great, but I have to plow through it and will.
We have a common interest apparently.
(I have a lower 5 digit account over there)
I'm looking forward to reading this, for sure.
But I gotta say that Greg Egan did a good job of this at the beginning of "Diaspora", which he published in 1997.
Also interesting is Kevin MacArdry's The Last Trumpet Project, which goes further: read-only access to the past permits bringing everyone who ever lived into digital heavens. It came out around 2010.
This is a very interesting quote to me because it's the type of quote that "the oligarchy" uses to label people like Alex Jones as wing-nut conspiracy theorists.
Thus if we really are going to blame social media, it's because we aren't more selective about the social networks we embed ourselves in. But this of course is a problem that has existed since the dawn of humanity.
- boosting things that are already popular increases their measured popularity even more, which both
a) over-emphasizes things that get reacted to quickly, which means e.g. quick outrageous headlines get even more benefit over in-depth coverage than they already do by human nature, and
b) over-emphasizes popular sources over niche ones.
- Facebook actually guesses at audiences, which means liking something can actually decrease its reach with the intended audience (the "Facebook mom problem": if you post e.g. about your latest scientific paper and your mom likes it immediately, because that's what moms do, Facebook assumes it's family content and is less likely to show it to the professional contacts who are the actual audience)
- for commercial content, Facebook intentionally limits reach among people who explicitly choose to follow you unless you pay for visibility
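The first bullet's "boost what's already popular" dynamic is a classic rich-get-richer feedback loop, which a toy simulation can illustrate. All numbers here are invented: the feed shows each post with probability proportional to its current engagement, so a small early lead tends to compound.

```python
# Toy simulation of a popularity-boosting feed (all parameters invented).
# Exposure is weighted by current engagement, so measured popularity
# feeds back into future exposure -- the loop the bullet describes.
import random

def simulate(initial: list[int], rounds: int = 1000, seed: int = 42) -> list[int]:
    """Run the feedback loop and return final engagement counts per post."""
    random.seed(seed)
    engagement = list(initial)
    for _ in range(rounds):
        # The feed picks which post to show, weighted by current popularity.
        shown = random.choices(range(len(engagement)), weights=engagement)[0]
        if random.random() < 0.5:        # viewer engages half the time
            engagement[shown] += 1
    return engagement

# Two posts of equal quality; one starts with a ~10% head start.
print(simulate([11, 10]))
```

Under these assumed parameters the early leader usually ends far ahead, even though nothing about the posts themselves differs: the measurement is amplifying itself, which is why quick outrage beats slow in-depth coverage in such a system.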
Neither is secondary; both are essential to the socio-algorithmic evolutionary process.
The algorithms are intended to optimize toward some user group behaviour. As the user group's choices influence the behaviour of the algorithms, so too do the algorithms' choices influence the behaviour of the user group.
> One thing we do know about quantum computers is that once we can get them to work, which is no small task, they will be unbelievably fast and increase available computing power by orders of magnitude compared to traditional computers
> The near-future world of Fall is full of familiar buzzwords and concepts. Augmented reality headsets, next-gen wireless networks, self-driving vehicles, facial recognition, quantum computing, blockchain and distributed cryptography all feature prominently.
That is not such a terrible outcome, but neither is it an especially good outcome. The quality of my e-mails and public speaking is, in my view, nowhere near that of my novels. So for me it comes down to the following choice: I can distribute material of bad-to-mediocre quality to a small number of people, or I can distribute material of higher quality to more people. But I can’t do both; the first one obliterates the second.