> This was the first sign that YouTube’s algorithm systemically directs users toward extremist content. A more neutral algorithm would most likely produce a few distinct clusters of videos — one of mainstream news coverage, another of conspiracy theories, another of extremist groups. Those who began in one cluster would tend to stay there.
> Instead, the YouTube recommendations bunched them all together, sending users through a vast, closed system composed heavily of misinformation and hate.
Wouldn't such a "neutral" algorithm lead to the criticism that people who start watching a video on a conspiracy theory only get recommendations for other conspiracy theories, rather than mainstream news?

And how would such an algorithm be neutral? How would an algorithm define a conspiracy theory? In truly neutral terms, what this author is demanding is tighter clustering: that YouTube keep people in their bubbles, which seems like a horrible idea.
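To make that point concrete, here is a minimal sketch (entirely hypothetical video titles and cluster labels, not anything from YouTube's actual system) of what a strictly cluster-confined recommender does: once a user starts in one cluster, every recommendation stays inside it, so someone who starts on a conspiracy video never gets surfaced mainstream coverage at all.

```python
import random

# Hypothetical catalog mapping video -> cluster label, for illustration only
VIDEOS = {
    "nightly_news_recap": "mainstream_news",
    "election_explainer": "mainstream_news",
    "flat_earth_proof": "conspiracy",
    "moon_landing_hoax": "conspiracy",
}

def recommend(watched: str, catalog: dict, k: int = 2) -> list:
    """Recommend only videos from the same cluster as the one just watched."""
    cluster = catalog[watched]
    candidates = [v for v, c in catalog.items() if c == cluster and v != watched]
    return random.sample(candidates, min(k, len(candidates)))

# A user who starts on a conspiracy video only ever sees conspiracy videos:
print(recommend("flat_earth_proof", VIDEOS))  # e.g. ['moon_landing_hoax']
```

That is exactly the "distinct clusters where people stay put" behavior the quoted passage describes as more neutral, and it is just a filter bubble by another name.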