Christchurch shooter was radicalized on YouTube, New Zealand report says (theverge.com)
56 points by pseudolus on Dec 8, 2020 | hide | past | favorite | 103 comments


I think we're lying to ourselves when we look for answers that don't acknowledge systematic failure modes in socialization of today's youth. The baddies du jour aren't the root cause, just a link in a long causal chain. It's a cop-out to say focusing on the baddies du jour is part of a multi-pronged approach when the other prongs aren't there, or consist of perfunctory, ham-fisted interventions.

But it's not really a surprise, our current approach to mental health is one of sweeping problems under the rug and hoping the rug doesn't notice. It's treated as a heartwarming story when someone is talked down from the side of a bridge, because comparatively nobody notices that they spend the rest of their life at a just-below-jumping threshold level of misery.


> current approach to mental health

This is a red herring; nowhere in the section of the report on him does it describe him as being mentally ill, and it was not a sentencing factor at his trial (https://www.bbc.co.uk/news/world-asia-53919624)

https://www.stuff.co.nz/national/christchurch-shooting/12258... : "He said after being assessed by the psychologist and psychiatrist, there was no evidence the terrorist suffered from any mental disorders or psychiatric conditions, or that he showed any clinically significant cognitive impairment.

The psychologist determined he displayed a range of traits akin to a personality dysfunction, but not to such an extent as to constitute a personality disorder."

> systematic failure modes in socialization of today's youth

This is rather complicated, because we're talking about radicalisation; he was "socialized" (or at least brought to those views) by the wrong adults, whom he chose himself.


He's also an adult himself, and I believe he was an adult when he was radicalized, so that bit about "socialization" makes even less sense.


> But it's not really a surprise, our current approach to mental health is one of sweeping problems under the rug and hoping the rug doesn't notice.

But I would say that people with mental health problems will always exist too. And we absolutely should also focus on those who intentionally push them toward extremism and then toward violent action.

Because trying to pretend that issue is just mental health or just socialization is not accurate either.


Events like this are a gold mine for opportunistic policy makers and think tanks.

The initial read on Ted Kaczynski was flat wrong. He was practically manufactured. I’d like to know more about the origins of Brenton Tarrant.

Both took the time to write lengthy manifestos which, if you read them, were more than what you’d get from a YouTube binge. Ted was somewhat of an original thinker; Tarrant less so. Fascinating.


It's a little bizarre that Kaczynski was a published mathematician of some note before he fell afoul of MK Ultra

https://medium.com/cantors-paradise/the-mathematics-of-ted-k...


You see, the problem here is that crazy has many forms. Like Ted Kaczynski, crazy can be smart and intellectual. The writings seem to make sense, especially to a young mind, and people develop an empathy for the cause. People read his manifesto and others like Mein Kumph, and sometimes people, especially growing minds, think it makes sense.

YouTube and other videos are much worse because you have full-scale productions instead of books and letters. Do you remember the ISIS recruitment videos? Where everything looked peaceful and welcoming? Then when people went there, it was hell on earth: some were raped, some killed, some suffered even worse.

There’s a line we haven’t figured out between free speech and hate speech. Ted’s manifesto is more like free speech but the ISIS stuff and Nazi stuff is pure hate.


> Mein Kumph


Why would it be mental health? He’s an ideological terrorist, plain and simple. The very fact that you would suggest mental health issues is deeply disturbing.


It’s a bit jarring when the excuses pour out of the woodwork for white lone wolf shooters and terrorists. I don’t disagree with what you say about systemic failures in the health system, but on the other hand, it ignores the crisis of white, young men getting radicalized at alarming rates.


If you've had a family member or a friend go down the rabbit hole, you'll understand both the zealotry and the difficulty in peeling it back. Here's what I've observed:

Youtube videos of sermons & bible study -> youtube videos of sermons preaching the end of the world is nigh -> youtube videos of deep state / conspiracy discussion -> youtube channel for a particular conspiracy theorist -> alternative website that isn't youtube[0] to consistently dole out conspiracy information (end state)

The end result a couple years later is that it now sounds crazy to suggest that Dr Fauci is not a mass murderer, to suggest that the US is not currently engaging in a shadow-WWIII with China, and to suggest that covid-19 is not a bioweapon produced by China. Forget about peeling back "ordinary" disinformation like the kind that gets flagged on twitter.

[0] For example, https://en.wikipedia.org/wiki/Guo_Wengui#G_News


Sorry for asking, but is this shadow-WWIII supposed to be a hot war? An economic cold war seems obvious from my steady diet of such conspiracy-minded websites as Reuters.

(and if covid-19 were a bioweapon, why would people be anti-mask anti-vax?)


> Sorry for asking, but is this shadow-WWIII supposed to be a hot war?

Well, I try to avoid talking about politics with her, but last I heard the US is "soon" going to be assassinating the leadership of China and potentially it will be sparking a nuclear war.

> (and if covid-19 were a bioweapon, why would people be anti-mask anti-vax?)

There are different groups of conspiracy theorists. In this case they are not anti-mask at all, although as for the vaccine, I was recently warned we don't know if it is actually part of China's ploy to infect people, so we should be cautious.


Ah, ok. Good luck!


> Well, I try to avoid talking about politics with her, but last I heard the US is "soon" going to be assassinating the leadership of China and potentially it will be sparking a nuclear war.

https://en.wikipedia.org/wiki/Assassination_of_Qasem_Soleima...

On 3 January 2020, a United States drone strike near Baghdad International Airport targeted and killed Iranian major general Qasem Soleimani while he was on his way to meet Iraqi Prime Minister Adil Abdul-Mahdi in Baghdad. Soleimani was commander of the Quds Force, one of five branches of Iran's Islamic Revolutionary Guard Corps (IRGC), and was considered the second most powerful person of Iran, subordinate to Supreme Leader Ali Khamenei. Four Iranian and five Iraqi nationals were killed along with Soleimani, including the deputy chairman of Iraq's Popular Mobilization Forces (PMF) and commander of the Iran-backed Kata'ib Hezbollah militia, Abu Mahdi al-Muhandis – a person designated as a terrorist by the U.S. and the United Arab Emirates (UAE).

Indeed the variables (and values of those variables) are not identical between the two scenarios (of which the Soleimani incident is just one example from a very long historical list), but anyone who interprets such concerns about geopolitical relations with China as "crazy" might want to turn their critical lens on themselves and see if anything appears (and while doing so, one should keep in front of mind that seeing faults in one's own thinking is far more difficult than seeing them in the thinking of others).

To me, those who enthusiastically and confidently underweight risk are a bigger danger than those who overweight it, and from an aggregate societal perspective, there are a hell of a lot more of the former than the latter. At the end of the day, the amount of actual risk in the system is the amount of actual risk in the system, which is distinctly different from how most people perceive it; most seem to assume the amount of risk in the system is whatever someone says is in the system (aka: an opinion). Reality is risky/problematic enough as it is; layering on this unnecessary phenomenon of completely mistaking our [opinions of X] for [actual X] (and getting angry at anyone who dares to disagree) exacerbates the problem.


This is getting a little off-topic; I was just answering a question. I don't consider the existence of risk of conflict with China to be misinformation. What I consider misinformation is interpreting Trump's December 2nd speech, in which he mentions "waging war on slavery", as confirmation that the missiles will soon be flying.

If you want to understand what kind of misinformation exists on GNews, please feel free to peruse it and see for yourself.


That’s funny. There are signs in my neighborhood that say “PEDOPHILE ALERT” with a GNews link.


>conspiracy theorist

The confirmed CIA operations are enough to make someone a radical. To suggest the US hasn't been in a shadow war with China would require ignoring what our intelligence agencies have been continuously telling us.


It would not be difficult to assemble the same kind of narrative based on plentiful evidence of the same abstract psychological behavior, except from the radical left.

I happened to read a BLM post on Medium once, and I now get a daily email of several blog posts that are at least as delusional as the above beliefs about how all white people are not just racist, but fundamentally and immutably racist.

Underappreciated (and rarely discussed in the media) is that this sort of radical content amplifies the sentiments of those on the opposite end of the spectrum (and vice versa), in an ever-intensifying positive feedback loop. I often wonder if politicians and forum/media commentators are completely oblivious to this notion, or if they are aware of it but have some sort of a "the ends justify the means" thing going on. I also often wonder why there is essentially zero initiative whatsoever to reach out to these communities and establish some sort of a dialogue, as is done with peace talks with warring terrorist organizations. Again, has this idea literally never crossed a single person's mind... or has it, but the utter lack of public discussion of the idea is a ~"strategic decision" of some kind? (That it never seems to come up even in forums suggests the former is the more likely of the two explanations.)

Human consciousness is a hell of a drug.


> It would not be difficult to assemble the same kind of narrative based on plentiful evidence of the same abstract psychological behavior, except from the radical left.

To be fair, I do think this is an underappreciated point. Despite the fact that I'm a leftist, I do see lots of misinformation coming from the left, and I've seen leftist facebook groups that quickly devolve into memes about killing landlords or other extremist drivel.

> I happened to read a BLM post on Medium once, and I now get a daily email of several blog posts that are at least as delusional as the above beliefs about how all white people are not just racist, but fundamentally and immutably racist.

Although I agree the left can spew out all kinds of BS, the difference here is that your example is lukewarm extremist content at best. This kind of content paints wild ideas but does not contest reality or directly incite weird actions. Here are some examples of what I mean:

* (contesting reality) My family member who reads GNews will sometimes randomly start conversations with absolutely out-there tidbits of misinformation like "oh my god, did you know JFK Jr is still alive? I just saw a video that proved it"

* (inciting actions) My family member who reads GNews went out of her way to acquire HCQ as a "just in case", and also tried to convince family members in China who had a fever (unconfirmed covid) to take it without even seeing a doctor. At some point she considered taking HCQ as a preventative for covid despite being at nearly zero risk of exposure (never going outside)

> I often wonder if politicians and forum/media commentators are completely oblivious to this notion, or if they are aware of it but have some sort of a "the ends justify the means" thing going on.

I strongly suspect this is the case.


> the difference here is that your example is lukewarm extremist content at best

That is "a" difference, not the difference

> This kind of content paints wild ideas but does not contest reality

It describes reality in ways that are not actually true.

> or directly incite weird actions.

You do not know the direct or indirect consequences of such content. Understanding causation in a complex, chaotic system is far beyond the cognitive capabilities of the human mind. To make it even worse, the mind typically (and incorrectly) perceives itself to be capable of such calculations.

> Here's some examples of what I mean

I belong to a wide variety of communities - I could also provide similar examples of silly thinking from those who fall into various categorizations. I don't disagree that the "far right" types may indeed pose greater risk (but to be clear: whether that is actually true, is unknown, and a body count perspective strongly suggests otherwise [1]), but to speak as if they are the only risk worth worrying about is itself contesting reality.

Reality is infinitely complex, but the conscious mind does not perceive it this way. Even worse, it (the mind, which is in ultimate control of each comment posted on forums like this) seems to refuse to even consider (almost without exception) that it is infinitely complex. I believe that this may be much more dangerous than it may appear (to the mind...which may happen to be just a little biased on the subject).

[1] Here I refer to war, in general, which is not the fault of "the far right", who seem to very conveniently get all the media attention.


I wouldn't say only far-right extremism is dangerous; any extremism that incites actions and contests reality is dangerous.

GNews (or at least the Guo stuff) is arguably not even far-right; most of their content is orthogonal to US left/right divisions, and they just happen to be supportive of Trump because Trump is vocally anti-China (and because they allied with Bannon in the past year or so).


To clarify: have you modified your opinion on the matter such that you now no longer consider far right beliefs to be especially dangerous, but rather only note that they are one of many risks, and that the true risk in the system is unknown and largely unknowable due to the inherent complexity of a large chaotic system?


> have you modified your opinion on the matter such that you now no longer consider far right beliefs to be especially dangerous, but rather only note that they are one of many risks

This was always my belief. If you examine my original post[0] I do not even discuss left/right framing, your response to my post introduced it.

[0] https://news.ycombinator.com/item?id=25345741

> and that the true risk in the system is unknown and largely unknowable due to the inherent complexity of a large chaotic system

I hadn't previously considered where the "true risk in the system" lies because my concerns with misinformation are more local (will my mother-in-law do something crazy this week?), but yes I would say that I think this is a reasonable assessment. If you're thinking of society as a system, then it is impossible to determine precisely which elements introduce risk of social instability/destruction for precisely the reason you mention.


Ah, my mistake...I attributed a portion of the general topic to your comment (ah, the sweet irony)!

Rather than far right, I actually meant the subculture that you seemed to be referring to. But your last response here well illustrates that you appreciate the complexity (and uncertainty) of the system, which is the fundamental idea I was hoping to introduce into the memeplex.


In my day it would have been heavy metal and Dungeons and Dragons. If that seems glib about a terrorist outrage that cost dozens of lives and families, we should apply the same standard of gravity to the quality of the excuses authorities make for it afterward. There is a YouTube rabbit hole, but it has nothing on the previous Tumblr hall of mirrors, or the Twitter chorus. Tech is a great scapegoat because it's not going to defend itself. I suppose part of moving on is finding a way to co-opt negative experiences and apply them to strengthen your agenda, but from politicians it still seems like cynical cant.


Heavy metal and Dungeons and Dragons did not actively push far-right extremist content at people. There is absolutely no comparison to be made there.


There's no comparison because it's now near-universally recognized that the moral panic on those topics was silly. But many people in the 80s were pretty convinced that heavy metal and fantasy role playing could drive you insane or make you start worshipping demons. It was the subject of Tom Hanks's first movie! (https://en.wikipedia.org/wiki/Mazes_and_Monsters)


Yes, but that was not true.

Youtube really does have far-right extremist content on it. It really does drive you towards it. This is something happening in reality.


It was true in part! Judas Priest has a quite emphatic, criminal-friendly message on the topic of Breaking the Law. The question - in both cases - is whether the very real unsavory messages actually cause people to do terrible things they would have otherwise avoided.


Actively? It sounds like you're erroneously attributing some sinister 'purpose' to the algorithm.

The YT recommendation algorithm doesn't have any preference for what it sends you to, only that you keep coming back.

The music industry might do something similar (simpler, poppier tunes), but it's far, far slower than YouTube, so you can accumulate less 'radicalisation' on aggregate.


Actively, as in, it is pushing you towards this content. You do not have to seek it out specifically, the algorithm will steer you towards it.


Ah, okay. Sure, that's what any recommendation system will do. But it's not like how Netflix 'actively' (might not be the best word to use, then) steers you towards Netflix originals because Netflix prefers you watching those.

That's what I thought you meant, but it looks like I was mistaken.


At some point the decision to not act on a problem becomes an active choice in favour of the problem, though.


But I think that's the point. Because Youtube has such a fast feedback loop, they have a special responsibility to avoid over-optimizing on getting you to watch the next video.


Neither does YouTube, I think that was the comparison. I'm sure you could find all sorts of right-wing, left-wing, Islamic, or other extremist content if you try hard enough. To say YouTube pushes it on people is just not true. If anything, their content moderation policies and algorithms are too strict.


But it does. It really does, it is well documented that it does.


I know a few journalists really want to push this narrative, but I think the more it's been carefully studied, the more it's been debunked. E.g:

https://arxiv.org/pdf/1912.11211.pdf

"After conducting a detailed analysis of recommendations received by each channel type, we refute the popular radicalization claims. To the contrary, these data suggest that YouTube's recommendation algorithm actively discourages viewers from visiting radicalizing or extremist content. Instead, the algorithm is shown to favor mainstream media and cable news content over independent YouTube channels with slant towards left-leaning or politically neutral channels."


Dungeons and Dragons advocates for:

* Innate race-specific characteristics [1] steeped in centuries of colonialism

* An antiquated notion of intelligence (INT) that neglects other types of intelligence (emotional, linguistic, spatial, etc.)

* A strong focus on collecting currency (gold, silver, platinum) and hoarding items that exacerbate wealth inequality

* The illusion of a meritocracy where higher level characters have "earned" their rank and enjoy the benefits of easier gameplay

* A problematic reference to Adolf Hitler's charisma [2] that was claimed to be misquoted but was still stricken from further editions of the manual

The list goes on, but it should be easy to see how D&D can be a gateway to further radicalization and dehumanization.

[1] https://www.polygon.com/2020/6/23/21300653/dungeons-dragons-...

[2] http://www.theescapist.com/basic_gaming_faq.htm#hitler


D&D has its problems, yes. But it is nothing like Youtube in how strongly it drives radicalisation.


I don't remember the "Tumblr hall of mirrors" or the "Twitter chorus" leading to mass shootings.


There is no report yet saying that. I think YouTube is just a scapegoat for the societal problems that led to that terrible event; any day someone can come forward with a theory that the bad things on the planet are caused by X, Y or Z, where X, Y or Z can be anything from YouTube/Twitter/FB to video games or drinking beer, and then the reaction will be to try to ban that. I saw the ban that resulted from this event; it was illogical to anyone with knowledge of the matter, but it was well accepted by an uninformed public.


Came here to say that. Thank goodness for YouTube! There are no records to spin backwards anymore to get the latest marching orders from Satan.


As a society, we learned a lot of creepy lessons from Orwell's 1984. But we didn't learn the positive lesson: a society needs a controlled way of rebelling for the sake of rebelling.

Heavy metal, punks, hippies... What's up today? Now it gets harder and harder to rebel as more and more things become "normal". So people go more extreme.

Soon the biggest rebellion will be to do nothing "rebellious": go to school, don't party too much, stick to classic conservative ideas, maybe go to church (or equivalent), dress in a shirt and jeans, go for the family/kids/house with a white picket fence.


I've got a preteen son who spends time on YouTube, and I can't block it with Pi-hole because there are tons of really awesome things on there. Currently he's geeking out on animal facts.

What I've done as a bit of a middle ground is to use uBlock Origin to strip away the recommendation sections and comments. He can still search and watch whichever videos, and even look at all the videos on a particular channel, but at least he's not getting sucked down a rabbit hole. I've had the conversation with him that it's not that I don't trust him; rather, I know how distracting those things can be and have found them detrimental even to my own attention/life.

Of course he'll probably go around them eventually as he learns how things work and finds some motivation to do so, but in the meantime he gets a cleaner experience of the internet.

FWIW these are the rules. They might be breaking something, but they're working alright for now.

    www.youtube.com###related
    www.youtube.com##.ytd-comments
    www.youtube.com###content > .ytd-rich-item-renderer.style-scope


You can use Google Family Link to force his account into parental controls and force YouTube into Restricted Mode (if he's logged in). You can also force this using DNS, to prevent him from accessing non-family-friendly content.
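
For the DNS route specifically, Google's support documentation describes forcing Restricted Mode by having your resolver answer the YouTube hostnames with a CNAME to its Restricted Mode endpoint (restrict.youtube.com, or restrictmoderate.youtube.com for the moderate tier). A sketch of the idea in BIND zone syntax, with the hostname list taken from Google's support page (verify it is still current, and note that some resolvers, dnsmasq included, have historically had caveats about CNAME targets they are not authoritative for):

```
www.youtube.com.         IN CNAME restrict.youtube.com.
m.youtube.com.           IN CNAME restrict.youtube.com.
youtubei.googleapis.com. IN CNAME restrict.youtube.com.
youtube.googleapis.com.  IN CNAME restrict.youtube.com.
```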


DNS will only let me block specific domains, right? So I can't do blocking of specific channels that way (though I do use a nice big blocklist for adult content [0])

Google is a bit of a non-starter for me, unfortunately. Their tools for protecting kids' eyes from content are probably great, but then they're building a catalog of my kids' viewing habits from a really young age... which I find off-putting. Maybe if that were the only way to block I'd go down that road, but there are others.

[0] https://raw.githubusercontent.com/chadmayfield/my-pihole-blo...
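
To make the domain-granularity point concrete: a hosts-format blocklist like [0] is just one domain per line (optionally prefixed with a sink IP), so a DNS blocker can only answer "is this whole domain blocked?". A minimal sketch of what a Pi-hole-style blocker effectively does with such a list (parser assumed from the common hosts format, not Pi-hole's actual code):

```python
def parse_hosts_blocklist(text: str) -> set[str]:
    """Parse a hosts-format blocklist ('0.0.0.0 domain' or bare 'domain' per line)."""
    domains = set()
    for line in text.splitlines():
        line = line.split("#", 1)[0].strip()  # drop comments and whitespace
        if not line:
            continue
        # hosts format is "<sink-ip> <domain>"; bare-domain lists have one field
        domains.add(line.split()[-1])
    return domains

blocked = parse_hosts_blocklist("0.0.0.0 bad.example\n# a comment\nother.example\n")
print(sorted(blocked))  # ['bad.example', 'other.example']
```

The unit of blocking is the whole domain; a specific YouTube channel lives under the same youtube.com name, which is why DNS alone can't filter at channel level.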


The report is huge, but a couple of sections that might be of specific interest are

https://christchurchattack.royalcommission.nz/the-report/par... : someone visited his rifle club beforehand and reported various red flags to the police - literally, Confederate flags, "militia" uniforms, racism

https://christchurchattack.royalcommission.nz/the-report/par... : a discussion of the existing NZ (and international!) laws on hate speech and incitement, and suggestions for tuning them.


> This is not the first time YouTube has been linked to radicalization and white supremacist content. There has been an ongoing argument about whether YouTube’s algorithm pushes people toward more extreme views over time.

I can see this in the small tech bubble I’m a part of. Right now I’m bombarded with “the Apple M1 is the greatest ever!”

If I start to watch news content, it seems like I go deeper and deeper into the videos that trigger the strongest emotional responses. Same with the tech bubble: “OMG, Apple or Google did what!?”


Well, the M1 is spectacular, so the algorithms are working correctly.


It works both ways. I remember antennagate and bendgate blowing up at the time. I had both phones and never had an issue with either. If I believed YouTube, both my phones were garbage and going to break.

Can’t wait for my M1 air to come in.


Roll this event back 25 years and people would be claiming that he was radicalized by Satanic heavy metal played backwards. Or jazz and marijuana. Or that Protestant heresy.


Those things did not exist.

Far-right extremism on Youtube actually exists.


I assure you jazz does indeed exist.


Jazz that radicalises you to mass murder does very much not exist.


Music has never radicalised anyone, nor video games. There's only a type of music (games) people listen to (play) who then go on to do radical things.


What?


Sorry, I mixed up comment chains. Someone referred to how black metal bands in Norway burned churches and people routinely claim violent video games cause violence, but I'm saying the causality goes the other way. Church-burning Norwegians 'cause' black metal, rather than the other way around.


To be fair, this happened; note the "Church arsons and attempts" chapter: https://en.wikipedia.org/wiki/Early_Norwegian_black_metal_sc...

In 1992, members of the Norwegian black metal scene began a wave of arson attacks on Christian churches. By 1996, there had been at least 50 attacks in Norway;[2][27] in every case that was solved, those responsible were black metal fans.


Kind of. Everyone involved with the attacks seemed to be directly connected to the local black metal scene at the time, but it's not like the music itself inspired people worldwide to burn churches.


> it's not like the music itself inspired people worldwide to burn churches.

unfortunately...

https://www.nytimes.com/2020/11/03/us/louisiana-black-church...


>Mr. Matthews has also said that he was copying similar crimes committed in Norway in the 1990s in an effort to elevate his status among the black metal community,

His entire reason was to ingratiate himself with the scene, the music still had little relevance.


It's much more likely that the desire to burn churches makes you more likely to make black metal/be a black metal fan than that black metal makes you want to burn churches.

It's reverse causation, kind of.


... who were also of the far right. Which is the real linking factor here.


Music can be, and in practice is, a vector for far-right propaganda too. Or I guess far-left, except I knew about neo-Nazi bands when I was younger and did not know about communist bands. By neo-Nazi bands I mean bands whose lyrics propagate those ideas. A distinction without a difference in this case, basically.

It was not metal, it was rock in that case. But here, those metal bands were also expressing their ideas and opinions in their songs.

There was such a thing as a disproportionate moral panic around metal in the USA. And there was also such a thing as metal bands actually being like that. It is like... there was a satanic moral panic where a lot of innocent people ended up suffering. And then there was real-world child abuse going on too.


Roll back three or four centuries, and one can find claims about radical Quakers:

https://news.ycombinator.com/item?id=23859546

"In its first few pages, the book accuses the Quakers of obscenity, adultery, civil commotion, conspiracy, blasphemy, subversion and lunacy. Milton was not out of fashion in applying bad manners to propaganda. It is merely regrettable that he did not transcend the frailties of his time."


No comparison, for a variety of reasons, not least of which being that a lot of this propaganda reaches a lot of the older folks as well.


If you listened to metal you didn't go out and shoot Muslims. On YouTube you have people deliberately manipulating facts to convince people to follow their extremist views. You are comparing apples to oranges. You should compare the history of propaganda media such as radio (the Nazis and Rwanda), TV, and the internet.


I've been on this journey myself, not that I'd ever harm anyone, but I was definitely radicalised by some of the far-left areas of YT when I was younger. I lost a number of friends because I was so convinced I was morally correct and they were wrong on topics like immigration and capitalism.

I'm aware of a few cases now where subscribers of TYT have killed people, but it's hard to know how direct those connections are. I know for me TYT definitely misled me and caused me to believe everyone who didn't share my views was evil, and I have wondered before what I might have done had I been a violent person.

I don't really know what the answer to this is. I know for me YouTube was also part of my deradicalisation process. Listening to people like Sam Harris allowed me to shift into the centrist I am today.

Given the very low number of people who commit violent acts like these, part of me wonders if we exaggerate the problem. Young men have always been fairly politically radical. I think it would be incorrect to assume radicalisation is any higher today because of the internet. These types of events have happened for a very long time.

I know often findings like this are followed by calls for censorship, and it does bother me today when I see content online obviously lying or misleading people about news stories to push a certain narrative. Still, given how poor a job YouTube and fact-checkers have done at trying to combat this stuff (as they often also have their own political agenda), I'm personally not convinced censorship is the right answer.


>I've been on this journey myself, not that I'd ever harm anyone, but I was definitely radicalised by some of the far-left areas of YT when I was younger. I lost a number of friends because I was so convinced I was morally correct and they were wrong on topics like immigration and capitalism.

I can sympathize. It's worse in Canada. We literally watched far-left stuff in school. Not once did they ever show anything even marginally conservative.

>I'm aware of a few cases now where subscribers of TYT have killed people, but it's hard to know how direct those connections are.

I did not know that. Are you referring to the Armenian Genocide, or has the TYT channel created violent antifa-type people?

>I know for me TYT definitely misled me and caused me to believe everyone who didn't share my views was evil, and I have wondered before what I might have done had I been a violent person.

I too saw many of their videos years ago, and they made lots of sense. They quite wrongfully portrayed Republicans as all Nazis and KKK, or at a minimum as racists. Mind you, I'm not American, nor do I live there. I don't particularly care about the Republicans. For all I know they are all racists.

>I don't really know what the answer to this is. I know for me YouTube was also part of my deradicalisation process. Listening to people like Sam Harris allowed me to shift into the centrist I am today.

I followed Sam before TYT era mainly during the Hitchens and Dawkins time. Sam is very smart but I've been listening to him lately and boy is he ever deranged on Trump. I suppose every president has the Clinton Crazies equivalent. It's very interesting to see Sam not rational about Trump.

Everyone is vulnerable to this radicalization.

The interesting thing: the very first Republican president, Lincoln, had a policy like Ben Franklin's: "Speak ill of no man, but speak all the good you know of everybody."

Yet he provoked a derangement like no other, ending in a civil war and his own assassination.

>I know findings like this are often followed by calls for censorship, and it does bother me today when I see content online obviously lying to or misleading people about news stories to push a certain narrative.

That's absolutely happening. Even Wikipedia has taken a political side. https://larrysanger.org/2020/05/wikipedia-is-badly-biased/

>I'm personally not convinced censorship is the right answer.

It's definitely not. Here in Canada we absolutely have extensive censorship. VPNs are becoming increasingly popular for Canadians.


Harris is a nice example of why "smart people" need to be very careful. Being "smart" in this sense usually also means you are very good at rationalising your beliefs.


> I did not know that. Are you refering to Armenian Genocide or like TYT channel has created violent antifa type people?

Gavin Long is probably the most clear-cut example I'm aware of, but we know others like Arcan Cetin were TYT subscribers, and some suspected he was politically motivated. Elliot Rodger was also a TYT fan, although I see no connection there.

> I followed Sam before TYT era mainly during the Hitchens and Dawkins time. Sam is very smart but I've been listening to him lately and boy is he ever deranged on Trump. I suppose every president has the Clinton Crazies equivalent. It's very interesting to see Sam not rational about Trump.

I know what you mean. Personally I think his views on Trump are reasonable, although the way he talks about Trump sometimes makes me feel he's too emotionally invested in the subject. I wish his rants about Trump had more facts and fewer ad hominems; there are others who make a much better case against Trump than Sam.

> It's definitely not. Here in Canada we absolutely have extensive censorship. VPNs are becoming increasingly popular for Canadians.

It's the same in the UK. We have to be quite careful about what we say publicly online, because if someone gets offended by our opinions (regardless of whether we intended offence) we can be arrested for a hate crime. I remember when people used to talk about how crazy it was that the Chinese government censors the internet and arrests people for having the wrong opinions, but this has become fairly normal in the UK today.


>Gavin Long is probably the most clear-cut example I'm aware of, but we know others like Arcan Cetin were TYT subscribers, and some suspected he was politically motivated. Elliot Rodger was also a TYT fan, although I see no connection there.

Being a fan isn't necessarily a "TYT made me do it" situation.

>I know what you mean. Personally I think his views on Trump are reasonable, although the way he talks about Trump sometimes makes me feel he's too emotionally invested in the subject. I wish his rants about Trump had more facts and fewer ad hominems; there are others who make a much better case against Trump than Sam.

I think it's not reasonable. He will say Trump is a racist, but then he gets asked what makes him think he's a racist: "He just knows"; he can't actually give an example. Meanwhile, after he bought Mar-a-Lago he opened it up to non-whites, literally the opposite of a racist. Sam then says, "He did that for business reasons."

Or look at the COVID timelines. In February, Trump shut down flights from China. You could still get to the USA, you just needed to reroute. For that he was called racist and xenophobic. We even had Canadian leaders call him xenophobic.

On February 3, at a time when the US and Australia had excluded foreigners who travelled from China, Hajdu denounced "the spread of misinformation and fear across Canadian society" and called on the opposition to "not sensationalize the risk to Canadians",[15][16] while Tam endorsed the position by the World Health Organization, which "advises against any kind of travel and trade restrictions, saying that they are inappropriate and could actually cause more harm than good in terms of our global effort to contain" the disease.[15]

But then in the very next sentence: "Trump is one of those guys who wants to shut down the border and build a wall, but he wouldn't shut it down for COVID because of the economy."

But you can't call him xenophobic for shutting down and then also give him shit for not shutting down more.

>It's the same in the UK. We have to be quite careful about what we say publicly online, because if someone gets offended by our opinions (regardless of whether we intended offence) we can be arrested for a hate crime. I remember when people used to talk about how crazy it was that the Chinese government censors the internet and arrests people for having the wrong opinions, but this has become fairly normal in the UK today.

Here in Canada we have comedians who made dark jokes over a decade ago being found guilty of "inciting hatred".

You have the left wing labelling every right-wing politician a "racist", not because they are racists but because that's how you silence your political opponents.


I've been thinking of YouTube as a great deradicalisation tool, because it tends to show that "they" are way more like "us" than we tell ourselves.

Has anyone been thinking of ways to set up deradicalising rabbit holes? I guess the basic problem is "the algorithm" would tend to push viewers away from those kinds of burrows...

Edit: I've been hoping that anti-"authoritarian follower" education might be helpful, as in https://news.ycombinator.com/item?id=24807977


It's not a simple issue to address. People who become radicalized often don't trust the popular narrative, often for pretty legitimate reasons.

If YouTube is too heavy-handed with their attempt to deradicalize, they're likely to just end up untrusted like a lot of media.


Is there any indication that there is some specific acute problem, such as an increase in percentage of society that becomes radicalized/violent, over other times in human history?

Is this a "something must be done!" because it's on the news, or because it's an actual new problem warranting special attention?


The acute problem I see is heavily armed, aggro people driving around in heavily flagged technicals (albeit Dearborn-built, not the more commonly seen Aichi-built, technicals).

When the Soviets decided that whatever their System was doing, it was not doing what they had been brought up to believe in, they changed it. In Russia specifically, Wikipedia says 3 people were killed during that change. (Romania was by far the most violent of the Warsaw Pact, with O(1000).) The US has not only stayed in Afghanistan far longer than the USSR, it has clearly had more than 3 people killed in political violence so far, and I am not optimistic enough to think they will get to a system "with liberty and justice for all" without adding any more...

But I am a cynic, so my fingers are crossed that I am wrong.

Edit: the "3 people" are from https://en.wikipedia.org/wiki/1991_Soviet_coup_d%27état_atte... ; please let me know if I am missing any more violent episodes.


> The US has not only stayed in afghanistan for far longer than the USSR

Don't forget about the US "sojourns" into LatAm! There's lots more where that came from.


Yes. Historically, new forms of mass communication have always been used to stir up insanity and violence, dating back to the invention of the printing press. Wide distribution of Martin Luther's antisemitic screeds led to pogroms across Europe, the Nazis picked up where he left off using the new inventions of film and radio broadcasts, and now violent, racist ideologies are once again gaining mainstream purchase thanks to platforms like YouTube.


Ok, so what are the indications that violent, racist ideologies are spreading faster or to a greater degree today than at other times?

Furthermore, if that is indeed the case, what makes you think that publishing platforms are the cause?

I'm skeptical of the first, and I am doubly skeptical of the latter.


>Ok, so what are the indications that violent, racist ideologies are spreading faster or to a greater degree today than at other times?

Read the news? There’s been a huge upswing in far-right politics dominating democracies worldwide post-2008. The world’s largest democracy, India, is currently controlled by an explicitly ethnonationalist party whose leader has praised Hitler. The US, Australia, Hungary, Austria, Brazil, Japan, and many others have lurched further rightward.

> Furthermore, if that is indeed the case, what makes you think that publishing platforms are the cause?

In historical terms, there is never a single cause, but multiple forces coalescing to push things in one direction or another. Obviously, the 2008 financial crisis and its impact on the world economy created ground that was fertile for all kinds of populism. Because of their long-standing alliance with capital owners, however, the right was better positioned to make use of these technologies as they emerged. You can also read any number of stories about the role that Facebook, WhatsApp, YouTube etc. served in successful right-wing political campaigns across the world. Google “WhatsApp Brazil Bolsonaro” or “Trump 2016 Facebook” if you need further insight. It’s pretty much common knowledge at this point.


I do not think that reading the news will tell me if these sorts of ideologies are more prevalent or are spreading faster than before, simply that they exist. I already knew that.

Why do you believe that there is some recent new acute problem?


I don't mean to restate the point, but how else do you arrive at these shifts in world democracies without these ideologies spreading faster or becoming more prevalent? The former is contingent on the latter...


You might be interested in the Global Terrorism Database (GTD) [1]. I used it for data about terror, but I think it actually has a lot of research into very specific approaches for dismantling terrorist organisations. Not limited to deradicalisation (it also contains research into how productive it is to "cut off the head of the snake"), but definitely a trove of information.

[1] https://www.start.umd.edu/research


Thank you, https://www.start.umd.edu/radicalization-and-deradicalizatio... looks very much like the sort of thing it'll take me a while to digest...


> Has anyone been thinking of ways to set up deradicalising rabbit holes? I guess the basic problem is "the algorithm" would tend to push viewers away from those kinds of burrows...

Well, music on youtube used to converge toward mainstream. You started with punk or metal and three songs later it was playing easy pop.

I guess that repurposing the same algorithm could work.


What would we consider a "radical" opinion, though? That always mediates itself through politics. Do you think people will not be angry if YouTube leads people away from EVERY 'radical' political opinion?

You'd get accused of "silencing" opinions. Ditto any social network, especially Twitter. You'd be taking away a medium from people without the ability to get on old mass media networks, which will invariably be people with "radical" opinions.


I'm not suggesting pushing people away, I'm suggesting attempting to set up an equally attractive (or even more attractive?) rabbit hole that tends to expose people to actual others, rather than just parroting second-hand opinions about those others.

(sorry to repeat myself, but I'd thought YouTube was exactly such a place. The other day I got some Yemeni beats, which was certainly nothing I'd have been likely to search out on my own. Obviously —as with TikTok— different tastes can lead to narrowing as well as broadening, however.)


That sounds interesting. I'm not fond of people proclaiming that this desire for increasingly radical media is inherently human, so I'm open to the possibility of a different feedback loop.

Do you have any idea of what that would look like?


Not really, that's why I'm asking here.

"Increasing radicalisation" just sounds much more like former-Yugoslavia than former-USSR to me, and I think the latter became former in a much better manner than the former became former.

See https://news.ycombinator.com/item?id=24373042 for the best I've come up with so far. (the Atlantic article covers deradicalisation of actual trained terrorists, essentially by hooking them up)

Footnote 2, although from a fictional source, is pretty much the whole problem with the idea, from a human factors standpoint.


Breadtube is one such rabbithole.


I'm subscribed to nearly everyone in that pseudo-group, and I would consider them just as radicalised. If you peruse the comment section of 'How to Radicalise a Normie' by (I think) Innuendo Studios, you'll see a ton of comments talking about how these people used to be radical in the opposite way (e.g. alt-right).

But if you look in the comment section of an alt-right analogue of Innuendo Studios' video (something like "I used to be an SJW"), you'll see the exact same type of comment about how people used to be like the people in the Innuendo Studios comment section.

I've personally been on both sides at one time or another, and I reckon that the 'radical' component stays the same or even increases, even as the attribute of 'left-wing' or 'right-wing' may fluctuate, like a helix of some kind.


Given that YouTube's algorithms have been shown to push people toward content with more and more extreme views, I'm not sure why you'd think it would ever serve that purpose.

https://dl.acm.org/doi/abs/10.1145/3351095.3372879?download=...

It's been running as a de facto radicalization engine for years at this point.


Isn't there an argument to be made that YouTube has simply improved (perfected?) the algorithm news agencies have been trying to perfect for decades? We all know that sensationalism sells in media, so is the difference between sensational headlines and 'radicalising' YouTube recommendations really a qualitative difference (the latter of which is decidedly sinister and perverse), or just a case of machines outperforming humans?


> difference between sensational headlines and 'radicalising' YouTube recommendations really a qualitative difference (the latter of which is decidedly sinister and perverse), or just a case of machines outperforming humans?

No, the difference is that YouTube doesn't get a free pass; you don't hear “so and so was radicalized on Fox News (or OANN, or wherever)”, but that's not because it doesn't happen, instead it’ll get blamed, if anything, on groups and alternative media that the radicalized individual fell into after being radicalized by something shaped like traditional media.


I've seen it do both. I remember the days when it shoved you towards literal Nazis, and then years later I got in a friendship-ending fight with somebody who was far more pilled than I had imagined. Among the wild stuff he was trying to tell me, was the idea that YouTube pushed people towards moderates, and he had a study that was fairly recent.

If this study was anything like accurate, then AT THE TIME it was pushing people very gently towards more moderate YouTubers, in a subtle way. My nazi-adjacent former friend was really upset about this.

Thing is, both he and I remembered and acknowledged the time when it was feeding people extremist views really noticeably. His problem was essentially that it had stopped. I guess my problem is: who stopped it, why, and when are they going to switch back to doing that again?


If you have the study, I'd consider it, but second-hand anecdotes from the "nazi-adjacent" aren't really compelling evidence. Google did take some minor steps toward curbing things a year or two ago, but I don't know that anyone has done a comparative analysis of the current and former state of YouTube's recommendation system.


There is a really good podcast that talks about this from the New York Times called Rabbit Hole:

Podcast link: https://podcasts.apple.com/us/podcast/rabbit-hole/id15074239...

Article from NYT: https://www.nytimes.com/interactive/2019/06/08/technology/yo...


I kind of wonder how much of this is intrinsic to human behavior and how much is based on media cues from our formative years. It seems really easy: play the right music, say things ominously, flash pictures. The people producing these videos aren’t experts on human psychology. They know how to manipulate with film. And maybe it’s because we spend our lives watching media, and we learn that certain cues make something reliable.


A good video which goes over this is The PewDiePieline:

https://www.youtube.com/watch?v=pnmRYRRDbuw

This should set your YT recommendations on a path for more relevant content if your interest is sustained. There is a lot to unpack with respect to how radicalization happens on YouTube and elsewhere.


I find hope in the knowledge that even the media or corporate masters allow us to use can be used to radicalize people. The questions I have are about what tools we can learn from white supremacists.

All coverage of this is about maintaining the status quo. Where are the lessons learned about how to disrupt it?


Saying that YouTube/conspiracy theories caused him to shoot people is the same as saying that metal music or video games caused this.

A mentally ill person is a mentally ill person: it doesn't matter what triggers him, because he is mentally ill, and then they'll point their finger at the thing he read or listened to instead of focusing on the fact that he was mentally ill.

No sane person will shoot up a school because of right-wing radicalization or video games.


Quoted further up:

"He said after being assessed by the psychologist and psychiatrist, there was no evidence the terrorist suffered from any mental disorders or psychiatric conditions, or that he showed any clinically significant cognitive impairment.

The psychologist determined he displayed a range of traits akin to a personality dysfunction, but not to such an extent as to constitute a personality disorder..."



