The problem with online opinions is that while the "opinions are like assholes, everyone's got one" adage is true, online nobody is there to tell you "stop watching this stupid thing". And we all know YT comments are a heap of garbage. Also YT/Instagram comments are fully moderated by the author, so removing all disagreements is easy. Easy to give the appearance of consensus.
And worse, once you watch one, YT's algorithms want to keep you watching, so suddenly you are flooded with related videos.
This leads to a problem.
I have countless examples like this. YT keeps pushing me down a specific ideological path without any action on my part. Sometimes it's great, but in the cases where it's not, it's not just bad but outright dangerous. I don't think YT has that effect on me, since I'm way past the point (age) where my behavior can easily be shaped. But I can see how this sends somebody over the cliff who is still figuring out who they are.
People like rockstar2016 (aka the MAGA killer) or the woman who went into YouTube HQ and started shooting the place up are probably good examples of how social media can radicalize those who are most vulnerable.
EDIT: “All but one of 30 Flat Earthers interviewed said they hadn't considered the Earth to be flat until watching videos promoting the theory on YouTube” https://www.engadget.com/2019/02/18/researchers-blame-youtub...
My favourite one right now is "5G is bad and will kill us all and microwave birds". There is not a single article on 5G without the Facebook comments being full of people shouting about the extreme dangers (even when it's just about enabling the fake 5G on frequencies previously used for LTE). No matter what you say, they will keep linking you vague YouTube videos by people who claim to be doctors and misinterpreted documents from the 80s. I don't get it at all.
Off-topic: David Dees has some funny (to me hilarious) artwork on his site that describes these conspiracy theories better than words: http://ddees.com/new-art/
The ones about "blockchain" and 5G are so friggin good I've set them as desktop wallpapers :)
From the Russian collusion narrative, to the Avenatti client claims, to the Covington school coverage, to the recent Jussie Smollett coverage: Western media increasingly does not bother to check facts before running with stories.
It's fair to blame the "msm lies movement", but that movement itself has a root cause.
The premise of this article is that these videos are going to brainwash/hypnotize poor fools into believing crazy things.
Newspapers need to win not by being the only voice in the room, but by being the most relevant, clear, researched, reliable, unbiased voice in the room.
And social media giving these people what they want creates a vicious circle that does real damage to society as a whole.
A platitude like "free exchange of ideas" does not address this problem.
You seem to be under the impression that the truth will win out simply by virtue of being true, but that is simply not how humans work. Disinformation spreads because it can be crafted specifically to take advantage of the flaws in how humans think and decide what is true. We can't just bury our heads in the sand and pretend this is not a problem.
I am not, in any way, arguing for censorship of false information. However, I do think we need to be aware of the threat disinformation poses, and work on strategies to help inoculate our population against it. It is a false dichotomy to think our only two choices are 'ignore disinformation' or 'censor it'. We can do other things.
It is dangerous to ignore it, or to act like it isn't a threat. While I do believe that the overall arc of history is towards more understanding and truth, it is foolhardy to assume this path just happens naturally. It happens because people pay attention to threats against the truth and respond in ways that counteract them.
It used to be that while we had free speech, publishing companies had standards of respectability. Newspapers weren't going to print smut even if it would sell more papers for example.
The problem with YouTube is that it's basically a newspaper editor with no ethics, it only cares about selling papers/maximizing on screen time. My son has recently been into watching videos of the moon landings/the lunar rover. If I leave YouTube running, the recommender will automatically start playing moon hoax conspiracy videos. What's worse is that it uses this same asinine recommender algorithm that tries to get me to watch as much YouTube as possible even though I pay them $18 a month. Given that they already have my money and aren't showing me ads, I would appreciate it if the recommender had my best interests at heart rather than just showing me videos that an algorithm thinks will addict me.
But being relevant, clear, researched, reliable and unbiased doesn't get as much attention as feeding people's biases and fears and all the rest of that. Is it possible to win if all you have to offer is relevance, clarity, research, reliability and lack of bias?
Today it's increasingly difficult to point to any particularly stable source of information. The media has become ubiquitously driven by partisanship, agenda, and sensationalism, and reporting of falsifiable facts has been replaced with opinion, conjecture, speculation, and leading statements. One of the biggest issues is that this is all happening during the era of the internet. In times past, if you wanted to read older news articles, you needed to go to a library, hope they had the paper scanned onto film, and then dig through it to try to find whatever you were looking for. Now when a paper engages in shoddy reporting, it can be instantly referenced and held up as an example to all.
The lowest common denominator always wins the short game in society. Offer somebody a sensational[ized] and poorly researched story, or a well-researched, considered, and frankly somewhat boring one, and the sensationalized story will almost always win in a one-off. It's fun to read lightweight trash when you're just going for a taste. Who didn't occasionally flick through the National Enquirer at the checkout stand? It was fun! But in the long game, the trash loses. It may be fun to read a trashy story here and there, but it's not fun when that's all you have, because it's ultimately meaningless.
The big problem is that I think the media was very late to react to the internet, and then overreacted by trying to compete with social media. And that inevitably meant becoming trash. I'm going to conclude with a very high-hanging target, The Intercept. In my opinion The Intercept remains arguably the most reputable media outlet today. Nonetheless, especially in recent times, they too have started to go down the path of partisanship, and it always ends the same way.
 - This is an archive of an article The Intercept ran some time back suggesting it was absurd and unimaginable for the president to suggest that some bomb threats against Jewish community centers might be driven by something other than anti-semitism. The outrage-driven article focused on his suggestion that attacks can, in lieu of anti-semitism, sometimes be done because "Someone’s doing it to make others look bad." The paper suggested such thoughts were "white supremacist conspiracy theories."
 - This is an article The Intercept ran two days later, after it was revealed that one of their former reporters had been the person executing the bomb threats. He was an extremely liberal black individual who reported almost exclusively on social justice and racial issues. He had been calling in the hoaxes in a convoluted attempt to frame his ex-girlfriend (who had recently broken up with him) for calling in the bomb threats.
 - This is how the article was updated given #2.
 - https://archive.fo/DwZLx
 - https://theintercept.com/2017/03/03/statement-on-the-arrest-...
 - https://theintercept.com/2017/02/28/trump-suggests-anti-semi...
What we know already is that powerful organizations have secrets that they spend big money on protecting. The difference now is that with the internet, the time for which a secret can be kept has shortened significantly.
Yes, we have a long way to go and we should continue to seek improvement, but don't try to tell me that things are getting worse when that's clearly not the case. Or at least provide some evidence that's not anecdotal.
That said, filter bubbles and AI feeding us more of what we “want” are probably increasing conspiracy theories and making us less tolerant of different views. There has been little talk of the morality of AI algorithms.
Personally, I was recommended ads for “hot” dating sites when I was having relationship issues. Not what I needed, but surely what most likely pays the best. The issue is the lack of AI morality, or rather, profit coming first. Objectively the AI was right, but morally it was not; I am a parent.
Ads probably should be regulated; at the very least, you should be able to say no to AI algorithms and behavioral psychologists maximizing profit for private companies.
One underappreciated aspect of the rise of conspiracy theories in the last 20 years is the fact that more people are insecure financially and have lost trust in institutions, often justifiably so. So conspiracy theories meet less resistance than they would have from people who are not worried about their survival in the face of a society that seems determined to immiserate and impoverish them if not destroy them.
Additionally, we have a number of examples of institutions being subverted to malign ends. From Wells Fargo to WikiLeaks, we consistently see dishonest behavior from organizations that claim to have the public's interest at heart...
Not that anyone who uses these techniques is necessarily pro-Nazi, but it's a technique with an established history of [attempted] use, IIRC, in the popularization of presently unpopular ideas.
YouTube and Google have put out tons of comms lately indicating that it's at least top of mind.
Or it's the usual default of labelling as such anything one doesn't agree with?
Second, while I agree that YouTube shouldn't ban people for believing in conspiratorial nonsense, even if I do think it's harmful, it's not unreasonable for their algorithm to favor things that are fact-based, or at least things that don't contribute to a toxic platform. It's hard to argue that any good came from Alex Jones posting the addresses of the victims of the Sandy Hook shooting, and it doesn't create a fascist dystopia when YouTube declines to promote something like that.
Also, I wouldn't be 100% sure that all the flat-earthers are trolling. There's an overwhelming amount of stupid people in the world, and while I will say that most of them are probably messing around, at least some are serious. At some level, I think it's far too easy for people to say "I'm just trolling" after the fact.
>Congress shall make no law [...] abridging the freedom of speech, or of the press; [...]
That said, if I were hosting my own video-sharing website, there's no way in hell I would recommend a white-supremacist video, or even a flat-earth video. If people find it on their own, that's on them, but for the same reason that I personally wouldn't pass along flyers for thedailystormer, I wouldn't promote someone like Alex Jones.
Oh, we worked on a project computing link budgets for geostationary communication satellites. So the company made all of their money off of satellites, and he could completely disconnect from that.
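For readers unfamiliar with the term: a link budget just sums the gains and losses along a radio path in decibels, dominated by free-space path loss over the ~35,786 km to geostationary orbit. A minimal sketch in Python (the EIRP, antenna gain, and frequency figures below are illustrative assumptions, not numbers from the project mentioned above):

```python
import math

GEO_ALTITUDE_KM = 35_786  # approximate geostationary orbital altitude


def fspl_db(distance_km: float, freq_ghz: float) -> float:
    """Free-space path loss in dB, with distance in km and frequency in GHz.

    The 92.45 constant folds in the unit conversions for the standard
    (4 * pi * d * f / c)^2 formula.
    """
    return 20 * math.log10(distance_km) + 20 * math.log10(freq_ghz) + 92.45


def received_power_dbw(eirp_dbw: float, distance_km: float, freq_ghz: float,
                       rx_gain_dbi: float, misc_losses_db: float = 0.0) -> float:
    """Basic link budget: EIRP - path loss + receive gain - misc losses."""
    return eirp_dbw - fspl_db(distance_km, freq_ghz) + rx_gain_dbi - misc_losses_db


# Illustrative Ku-band downlink: ~205 dB of free-space loss at 12 GHz from GEO.
loss = fspl_db(GEO_ALTITUDE_KM, 12.0)
print(f"path loss: {loss:.1f} dB")
print(f"received:  {received_power_dbw(50.0, GEO_ALTITUDE_KM, 12.0, 40.0):.1f} dBW")
```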
I believe he believes it. But it's impossible for me to know.
I make no assertion other than that.
Isn't that the same stuff conspiracy theorists always say when someone points out their theory is BS?
Unfortunately, we see a lot of conspiracy theories have very real and damaging effects on people, often not those indulging in them. Just look at the idiot who attempted to burn down that pizza place because of Pizzagate, or the many people harassing families of school shooting victims because they believe those shootings were all government-orchestrated tragedies.
It's not hard to see why recommending false and/or misleading information is dangerous, and you'd have to be purposefully ignorant to not see that.
I don't think that the NY Times or the YouTube executives would ever advocate that you arrest people for spreading this misinformation, short of direct harassment, but they don't want to be party to the purposeful dumbening of society.
Google is a massive multinational corporation that profits by harvesting and exploiting personal information on people. They have shown an eagerness to expand their operations to countries such as China. They planned to launch a censored search engine and were happy to agree to record and track individuals by tying their searches to their phone numbers. Such behavior would enable convenient tracking and 'correction' by Chinese officials if they so desired. Among the list of terms they included in their prototype for China was literally "human rights." A company literally censoring information on "human rights" is like something out of bad dystopian fiction. This should not be reality - but it is.
I'm sure you see the point I'm making. By claiming that the average person is too stupid to be allowed to access whatever information they would like, you are implicitly asking somebody to be the gatekeeper of truth who decides true vs false, right vs wrong. But nobody is capable, let alone deserving, of this right. And the most ironic thing of all is that the typical gatekeepers of truth that people would nominate are some of the worst actors in society, who have certainly caused unimaginably more harm than the total sum consequences of all the absurdity spread by individuals. That's because when individuals spread fake stuff around, they might fool a few people, but it mostly goes without notice. By contrast, when the current gatekeepers spread misinformation, hundreds of thousands of people die, at a cost of literally trillions of dollars.
 - https://www.nytimes.com/2004/05/26/world/from-the-editors-th...
 - https://theintercept.com/2018/12/01/google-china-censorship-...
This seems to be being mentioned more and more these days, too. It's as if it's the only thing people learned from the war, and they blame the media for it.
That said, and as I've kept repeating, YouTube and Google are not government entities, and they aren't required or even given incentive to platform horrible people, or people that they view as horrible. While I agree that it's a bit disturbing that Google is releasing a censored search in China, I don't live in China, and I was talking largely in regards to the United States, (since that was where the whole freedom of speech thing came up).
We draw the line all the time. If someone was in my house and started spouting off Neo-Nazi propaganda, I would tell them to leave, and I would be completely unimpressed with their argument for freedom of speech, as I think you would as well. Am I an evil totalitarian dictator because I don't want to give an audience to people I think are disgusting? Am I anti-free-speech because I'm denying the other members (especially children) of my house the ability to hear opposing viewpoints? Of course not; it's my property and I don't want disgusting people in there.
I would definitely prefer to keep open discourse, but my point is that I don't see how it comes down to the evil dystopian world that your comment indicates because YouTube doesn't want to recommend stuff that they view as dangerous.
YouTube is a natural monopoly, which changes the whole picture. It even works as a bypass for literal first amendment infringement by the government. Imagine a government entity wanted to prohibit discussion of a given topic. In past times, their only option would be to try to pass legislation against it. That's where the first amendment kicks in. In modern digital times, however, there's another option. They can simply apply pressure or offer incentives to e.g. Google and Facebook to ensure it ends up on their blacklists. It's a clear violation of the spirit of the constitution without clearly violating the constitution. None of these issues came up when the constitution was being drafted, as the concept of a private company holding a monopoly on public discourse would have been completely nonsensical.
I think it's completely unavoidable that the next socioeconomic movement of society will be toward an overt corporatocracy. That's disappointing, but it is what it is. I only wish people would realize that these steps are exactly how we get there. This all effectively comes down to not only accepting a monopoly of this scale, but further suggesting that this monopoly begin ensuring that discourse is 'corporate approved'. I'm certain YouTube will be thrilled to comply.
Sure, I have an issue with YouTube being a near-monopoly too, and if the discussion came down to "YouTube is deleting videos for ideological reasons", I think your point would stand.
In this case, however, the issue comes down to their recommendation engine. Even if you did view YouTube as a government entity, which I do not, the constitution doesn't say anywhere that the government has to recommend every side of an argument, just that you're allowed to make it.
All that being said, I actually have been working on and off on a clone of YouTube using distributed hash tables, so maybe we'll be off of it soon enough :)
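For context, the core DHT idea behind such a clone is mapping content IDs onto a ring of nodes, so any peer can compute which node should hold a given video without a central index. A minimal consistent-hashing sketch (the node names and video ID below are made up for illustration; a real DHT like Kademlia or Chord adds routing tables and replication on top of this):

```python
import hashlib
from bisect import bisect_right


class HashRing:
    """Minimal consistent-hashing ring: a video ID maps to the first node
    whose hash position on the ring follows the ID's hash (wrapping around)."""

    def __init__(self, nodes):
        # Place each node on the ring at the position given by its hash.
        self._ring = sorted((self._hash(n), n) for n in nodes)
        self._keys = [h for h, _ in self._ring]

    @staticmethod
    def _hash(key: str) -> int:
        return int(hashlib.sha1(key.encode()).hexdigest(), 16)

    def node_for(self, video_id: str) -> str:
        # Find the next node clockwise from the ID's position on the ring.
        i = bisect_right(self._keys, self._hash(video_id)) % len(self._ring)
        return self._ring[i][1]


ring = HashRing(["node-a", "node-b", "node-c"])
print(ring.node_for("video-123"))  # deterministic owner for this video ID
```

The appeal for decentralization is that adding or removing a node only remaps the keys adjacent to it on the ring, rather than reshuffling everything.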
Efforts at competition should not be neglected, but I think it will likely prove futile. There are already plenty of alternatives to YouTube, but that doesn't matter with a natural monopoly. Content producers want to go where viewers are. Viewers want to go where content producers are. Whoever becomes the 'big one' first, wins.
There's a fundamental problem: content that users intend to be free is content that private companies then claim effective ownership of as a condition of being able to say anything at all. This is an interesting 'trick'. I call it a trick because, say the average person posts something to e.g. Facebook or YouTube. Would they mind if another site, with attribution, also shared their content? In the vast majority of cases, the answer would be no. Most people are just posting things for enjoyment or to express themselves; they'd love it if their content got shared as much as possible. But other sites cannot share these users' content, because e.g. YouTube or Facebook claim and defend exclusive ownership of what is posted on their site. You'd need to get a user's express permission to share their content, and that's not really viable.
Imagine for a second that we killed this trick. Companies that provide user generated content for free, or with a free account, could only publish content under non-free licenses if the content creator specifically opted in to that agreement. However, even if they chose to not opt-in the company would still be obligated to publish and treat their content identically to how they would have if the user had opted in. This would all go away if the company charged even $0.01 for access. The company could also incentivize users to opt-in, such as by paying them up front for their content.
The idea is to turn "free" into simply free. This would enable real competition since free access means somebody could simply start copying content created by users who wanted to post free content on e.g. YouTube or Facebook and share it in a different venue. The exact same would be true of comments and other such user generated content that was always intended to be free, and not "free", to begin with.
But so long as a monopoly is able to claim effective ownership of material users meant to be free, this system will likely only grow larger.
You're underestimating the stupidity of a vast portion of the population. There are plenty of people who believe not only that the earth is flat but many other conspiracy theories as well. Dismissing it as trolling is naive.
If you don't agree, why are you pro-censorship?
I think your narrative is pretty far off the ball here.
It still doesn't answer the question posed.
They should be required to host it / anything?
Yeah, but that's true.
Totally plausible, and in some cases (certain malicious apps etc, on Android or jailbroken iPhone) totally true.
At some point aren't they responsible for hosting it?
If I were hosting it, I'd feel somewhat responsible.
This stuff starts with Authority, on live TV, not some rando YouTube channels.
I have no idea what you mean by "More guilty".