Cody'sLab channel suspended [video] (youtube.com)
88 points by cisanti on Nov 6, 2017 | 57 comments



The world is a better place with Cody'sLab; YouTube needs to sort their shit out. He may have made a couple of slightly dodgy videos, but the vast majority are really good, and some are very impressive science videos.


Which ones? I wasn't aware that Cody's Lab could receive strikes. The content is so tame, copyright-free, and educational.


> I wasn't aware that Cody's Lab could receive strikes.

Oh, it can. All it needs is a couple of script kiddies with no life, a grudge, or both: they write bots that automate content reports, and the antispam measures act on their own. Sometimes it's even possible to do manually, when you have enough users in your fanbase to invade a competitor's channel...

It's a well-known strategy not only in the YouTube "streamer/LP" cesspool, but also among the alt-right on Twitter and Facebook.


Probably things that deal with explosives. He may have had a video on making nitroglycerin.


Nothing on making it that I remember, but several on how sensitive it is and how powerful.


He literally has a video titled "Making Nitroglycerine": https://www.youtube.com/watch?v=GMmYPAS9xB0

Not that I have any reason to think this has anything to do with the channel suspension.


My mistake, I hadn't gone that far back into his videos yet; he has way too many to watch them all. I was thinking of the recent ones where he was hitting nitroglycerine with knives and filming it at high speed.


And in the old days you just looked these things up in the encyclopedia...


A big problem with YouTube's three-strikes approach is that it's the same for all channels, big or small. It doesn't matter how much good content you've produced or how long you've been making it: if you get three strikes in a short period, your account is deleted, the same as any other channel's.

So channels that produce a lot of content or upload videos quickly have to be more careful about what they upload. There's no credit for good behaviour, and no allowance for the extra risk channels take on by posting lots of videos, potentially getting a strike with each one.

But little old me only uploads one or two videos a year, so I can afford to have every video copyright-struck and suffer no consequences, since strikes expire in six months.

It seems unfair.


We go through this every time there's a copyright takedown story or something like it. Baseball's three-strikes rule is not a good system for moderation.

At least moderating the billion hours of content constantly added to YouTube needs no nuance at all.


Looks like he did some kind of experiment involving microwaves and bugs. People reported it and the channel got taken down.

Are people really that sensitive nowadays? And why take the whole channel down? Just disable the ads if the advertisers are the problem. This channel was really big, so I believe it gets plenty of attention.


To set some context for those who haven't seen the video in question: he placed a jar of fruit flies into a microwave and turned it on.

The flies were NOT harmed, due to their size, and he went on to explain exactly why, discussing how microwaves function, EM wavelengths, standing waves, and so on.
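
For anyone who wants to sanity-check the physics, here's a quick back-of-the-envelope sketch in Python. It assumes the standard 2.45 GHz consumer-oven frequency; the insect sizes are rough assumptions for illustration, not figures from Cody's video:

    C = 299_792_458.0   # speed of light, m/s
    FREQ = 2.45e9       # typical consumer microwave oven frequency, Hz

    wavelength = C / FREQ          # ~0.122 m
    node_spacing = wavelength / 2  # standing-wave cold spots repeat every half wavelength

    FLY = 0.003          # fruit fly, ~3 mm (assumed)
    GRASSHOPPER = 0.05   # grasshopper, ~5 cm (assumed)

    print(f"wavelength:   {wavelength * 100:.1f} cm")    # 12.2 cm
    print(f"node spacing: {node_spacing * 100:.1f} cm")  # 6.1 cm
    # A ~3 mm fly is tiny compared to the ~6 cm between hot spots, so it
    # can sit in a cold region; a ~5 cm grasshopper spans nearly the full
    # node-to-node distance and cannot avoid the peaks.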

However, anyone who looked only at the title might believe that he was harming the fruit flies, which is potentially why that video got so many reports.


He extended the experiment by showing that a grasshopper would be affected by the microwave’s wavelength; the bug exploded.

This was later replaced with a grape, which is also larger than the fruit flies that would be safe in the same box.


Yeah, it was the grasshopper experiment that did it. The video was unlisted and only shared with his Patreon subscribers, but it was definitely not something he should have done if he wanted to stay on YouTube's good side (something he realized, since he redid it with a grape and put that one up instead of the grasshopper).

I've been a subscriber of Cody's for years, but he can be a bit reckless about things. Particularly about safety (just watch his recent video where he sliced through his finger with a knife blade that shattered when he slapped it against nitroglycerin), but also about doing something like exploding a grasshopper without much forethought about how it'll be received. It's part of his charm, but I can see it coming back to bite him again and again.

I love the channel though. It's one of my favorites. He never seems to run out of ideas for interesting experiments, unlike a lot of other sciency channels.


Got a link to that exploding grasshopper scene?


It's my understanding that when content is taken down or otherwise flagged, one of the first consequences is that monetization is removed. So imagine if your code got struck down in code review and they took away your last two weeks' paycheck.


I don't think he meant _that_ kind of bug, ha!


If my paycheck were as small as what most publishers get from a single video most of the time, I wouldn't really care.


Sure, but it's not about what "most publishers" get.

It's about Cody staking his livelihood on YouTube income. Typical income seems to be about $0.01 per view, and his nitroglycerin video is worth almost $12k; at a cent per view, that's over a million views.


Regardless of people's sensitivity, I think the system functioned the way it is supposed to: enough people report the content, it gets taken down, and apparently this isn't the channel's first strike. The only criticism I see fit to make is that they should add a manual check of the content if the channel is big enough, before it gets automatically taken down.


I disagree. Flagging a channel should disable it solely for the overly sensitive dimwitted user that voluntarily selected to view the video in the first place. YouTube is not a broadcast medium. Videos are available by voluntary positive action of the viewer only. Taking a video down, much less a whole channel, doesn't protect the viewer whose sensibilities were aroused. It only serves to forbid others from making the choice of whether or not to view the content themselves.


I agree that taking the whole channel down for the flag is not the right approach, and I would have thought the "three strikes rule", in any form, would have enough evidence against it by this point to kill it for good. But I think there's a bit of nuance missed by assuming that users are solely in charge of the videos they see. YouTube will autoplay related videos if you let it; I wouldn't exactly call that a positive action by the user. And recommended videos are a pretty large part of YouTube's model beyond that, so only removing the video for whoever flagged it doesn't seem like enough.

A better solution might be to unlist the video from these related lists, and perhaps escalate things if a certain volume of flags is hit. I also don't see why reputation seems to have no bearing on this. How old and large is Cody's Lab's library, and how many strikes has it received (that were not later removed)? Seems like a good litmus test for false positives to me.


> It only serves to forbid others from making the choice of whether or not to view the content themselves.

No, there is certain content that is forbidden by law in some countries, for example Holocaust denial, Nazi propaganda, and other racist or otherwise discriminatory content. I do wonder why YouTube and Twitter don't disallow that kind of stuff by default; there's no reason to give this kind of content any platform, even if it's legal in the US.

Also, fringe stuff like antivaxxers, "freemen"/Reichsbürger movements, flat-earthers, chemtrail believers, other conspiracy theories, or Russian fake news would be far smaller if YouTube had decided that this content is not wanted on its platform. Instead, YouTube willingly provides a platform for this content and is therefore directly responsible for the resulting problems.


And you want to be the judge who decides what is right and wrong?

Homosexuality is considered morally and legally wrong in many countries of the world, and it's a sensitive topic; why not ban it by default, since it's just the joy of a fringe group of Americans?


> And you want to be the judge who decides what is right and wrong?

As soon as a minority is discriminated against, no matter what minority it is (be it PoC, Latinos, Asians, gays, lesbians, transsexuals, or even social groups such as poor people), it's wrong.

Dead easy definition, and for racial and gender-based discrimination it's worldwide agreed-upon international law anyway: racism => ICERD, https://de.wikipedia.org/wiki/Internationales_%C3%9Cbereinko..., gender-based discrimination => CEDAW, https://de.wikipedia.org/wiki/UN-Konvention_zur_Beseitigung_.... And for the other forms of discrimination, a legal basis may be extracted from the Universal Declaration of Human Rights.


Well that's clearly not believed by most of society. The young are discriminated against ferociously and denigrated at every turn, yet few would call it wrong. Few would argue that teenagers shouldn't be spied upon, controlled, and given fewer choices in their day to day life than felons in prison. It is 'for their own good' after all. As discrimination always has been, and always will be, couched.


You've mixed up two related topics: the right of a private entity to decide their ToS, and the right of the state to restrict information.

Disgusting as it may be, none of that content should be banned by the state. It places too much power in the hands of administrations which are frequently abhorrent.

The right of Youtube/Twitter/Facebook to forbid content on their own servers should be exactly the same as their right to burn copies of books that they own.


I would like to propose a law against the use of Germany's laws against Nazi symbology as an argument against freedom of speech. Given such a law, would you maintain your position, which would require HN to ban you for saying it?


If HN (or, for that matter, any platform!) no longer welcomed me for opposing Nazis, I'd be fine with it - so yeah, I'd stick to my position.


I think you misunderstood his question. His question was whether you would support HN banning you if there were a law protecting free speech and making 'you may not post the swastika shape' and similar laws illegal.


>No, there is certain content that is forbidden by law in some countries, for example Holocaust denial, Nazi propaganda, and other racist or otherwise discriminatory content. I do wonder why YouTube and Twitter don't disallow that kind of stuff by default; there's no reason to give this kind of content any platform, even if it's legal in the US.

I think you are confusing things. The fact that you want others to be forbidden the choice of whether or not to view the content does not in any way change that suspending a channel accomplishes exactly that. The reason it is a horrendously dangerous, terrible, horrible idea to accede to a desire to hide 'dangerous' information would, I would hope, be easily seen by now, but it's obvious that this isn't the case. People seem to have forgotten that the Nazi newspaper was banned by the government of Germany after the Beer Hall Putsch. They seem to have forgotten that prior to that, the Nazi party was some guys hanging out in a pub being angry, and that once the disliked establishment banned their paper it became the thing every German 'had to read'. They seem to have forgotten that it was the exact kind of censorship they seek to enact today that catapulted Hitler and his party into the forebrain of every downtrodden German.

The only legitimate and adequate response to a bad idea is a better idea. Full stop. You cannot ban an idea. You can, however, drive it into hiding where it will see no opposition, no correction, no competing evidence. At that point, you guarantee for yourself the creation of an insular subculture who will draw the disaffected and opposition to the establishment. It is a Bad Idea. It has always been a Bad Idea. There has never been a happy state with heavy censorship.

Had we had such censorship in the 1960s, interracial marriage would still be illegal. Had we had it in the 1800s, slavery would still be legal. Every progress of mankind begins with spitting in the face of the established order. And that Nazi morons are clearly not a part of such progress is of no great import. You (generally, in the 'your society' sense of you) will be incapable of judging the next step of progress and this is guaranteed. In the United States, there is a legal concept called Prior Restraint on free speech and it is very aggressively avoided by the courts, as well it should be.

You are incorrect that various fringe groups would be smaller if YouTube became a bastion of censorship, and also that the platform owner is responsible for problems that you believe resulted from ideas being available. The existence of the fringe groups, their sizes, and the problems are all a result of widespread anti-intellectualism and a rejection of critical thinking and reason as an acceptable means of determining one's beliefs. Things like the anti-vaccine movement in particular grow far more through word of mouth than through online resources. Zero of the anti-vaxxers went online to seek information about vaccines and found anti-vaccine ranting and then made a decision to not vaccinate their children. Instead, they heard about it from mothers who "just knew" that vaccines gave their kids autism. Or they listened to their president talking about how there are "just too many" with no basis. The idea that vaccines are harmful, that letting strangers (intellectual strangers who don't care) stick steel needles into their children and inject them full of chemicals and dead viruses was dangerous, simply felt right, and feeling right is literally the only thing in the world that could convince them of any position.


Thanks for the long reply - you have some very valid points there, but also some that I don't believe to be correct.

> At that point, you guarantee for yourself the creation of an insular subculture who will draw the disaffected and opposition to the establishment.

Yes, and this is exactly what I want. Loads of Nazis and trolls flocked to Voat after the banning of r/fatpeoplehate and friends, and more after the recent banwave. As a result, the amount of hate speech on Reddit has massively declined, and Voat is not a place that's relevant outside of Gamergate and Nazis. The casual Internet user won't ever see a tiny bit of it. The same is also true for Twitter / gab.ai, to which lots of Nazis in Germany fled.

It's the same way in politics: in Germany we had the NPD (Nationaldemokratische Partei Deutschlands), which was a really vile cesspool of literal and neo-Nazis, and it was mainly left to fend for itself; and it kept most of the ultra-right-wing stuff out of mainstream politics. True, that changed with the rise of the AfD, but it worked for decades.

> Every progress of mankind begins with spitting in the face of the established order.

Point is, we already had the Nazis in power in history (Hitler of course, but also his friends Mussolini and Franco). They're not progress, and they cannot be.

> You are incorrect that various fringe groups would be smaller if YouTube became a bastion of censorship

As said above: look at what happened to fatpeoplehate and friends. Or look into the history of our Reichsbürger ("freemen" in US terms, IIRC): they rose with the ability to spread their "ideas" for free across the Internet, and when taken out of the mainstream, they will shrink again.

> Instead, they heard about it from mothers who "just knew" that vaccines gave their kids autism.

And where do they find these mothers? On YouTube and on various Facebook groups, mostly (at least this is what a children's doctor told me when I asked him about the problem, though the "anecdata" caveat does apply here).

> Zero of the anti-vaxxers went online to seek information about vaccines and found anti-vaccine ranting and then made a decision to not vaccinate their children

Nope. When I type "vaccines" into Google, the third autocomplete suggestion is "vaccines cause autism". Not "do vaccines cause autism". When I type "Ist Deutschland" (Is Germany ...), I get "ist deutschland eine gmbh" (is Germany a company), "ist deutschland souverän" / "ist deutschland ein souveräner staat" (is Germany a sovereign state) and "ist deutschland ein staat" (is Germany a state). And for each of these Reichsbürger queries, 8 out of 10 results are Reichsbürger propaganda. That can and will lead people into believing this stuff.

Google keyword suggestions are dangerous, and they do influence society.


>As said above: look at what happened to fatpeoplehate and friends.

Right. Did they shrink? No. As I said, they grew. They are out of your sight, out of the sight of more reasonable people who might argue with them, who might give them a sense that their views are not accepted or correct. And they're getting more numerous because of this. And they're drawing people who don't even agree with their primary objectives but simply feel disaffected by the other places that will smack their hand and call them names just because they asked a question. They're so big that we get Pizzagate. They draw so many that we get Trump. For years they were just tiny subreddits you didn't have to go to unless you wanted to. People act like Reddit is some place where people get shunted into subreddits unawares. That is not how it works. You have to search or seek them out. And they remain as 'contained' there as they do at Voat. The difference is that the barrier to entry for people who oppose them is lower.

>And where do they find these mothers? On YouTube and on various Facebook groups, mostly

No, we're not talking about tech-savvy people. Facebook maybe, but more likely they see daytime talk shows, or read celebrity news magazines, or actually talk to the other mothers at their kids' day care and such. They do not trust things like 'online resources', and the more authority the site has, the more they distrust it. They are seeking to be emotionally swayed, not to research a topic. Intuition is their guide.

>Google keyword suggestions are dangerous, and they do influence society.

I don't know enough about how those suggestions even work to say much about them, honestly. Given how weird they often are, it can't be a popularity thing. (Or are they often gamed to insert weird searches?) I don't think many parents would actually be searching Google. They'll ask their coworkers, their friends, their family, but they're not going to ask Google. Then all they'll get is "garbage" from doctors and other disconnected ivory-tower reason fetishists like the ones who said thalidomide was safe.


This is really the problem that comes from YouTube's "Heroes" program and having users police each other's content. You remove choice from the equation, and as a result content becomes watered down and a lot of the more interesting topics are suppressed.

Granted, some content is so grotesque or offensive that it should be left off the platform (e.g. beheadings, murder, porn, etc.); however, I suspect this kind of content is much more black and white, and can be detected and removed with much less controversy.


> Videos are available by voluntary positive action of the viewer only.

Not true; you see suggested videos all the time.

What about really bad stuff? Like ISIS execution videos? Should they not be blocked?

If you think there exists some class of video that should be removed from YouTube, then it is just a matter of degree. You would still need some mechanism for people to flag a video and have it be removed for everyone.

If you think NOTHING should be removed from YouTube, including the most horrific execution videos, then you must never show suggested videos, which takes away a big feature from YouTube.


You are right about autoplayed suggested videos, but ones that are merely suggested in the right-hand bar still require someone to say "I wanna see" and click on them. And, of course, you can trivially disable the autoplaying of the next suggested video.

As for newsworthy videos like ISIS executions, I don't see much trouble with them. They are factual, and they help people see why ISIS should be opposed. They make it impossible for anyone to pop up and claim that ISIS's means are peaceable or compatible with peace or reason. Banning the videos puts those who would argue against ISIS in the position of having no evidence and asking those they are arguing with to trust them, a thing intellectually repugnant for anyone to do.

Given the way suggested videos work, the only people who would be recommended things like ISIS execution videos would be people who were interested in ISIS execution videos for one reason or another. It's not like they would be recommended to me, someone who mostly watches tech talks and science stuff.

And of course there are certain things which should be removed from YouTube. Not because they are "offensive" (which is a free choice on the part of the audience, not the creator of the video who has no control over how their unknown future audience will have trained themselves or been trained to respond) but because they are harmful. Things like blatant copyright infringement, child pornography, revenge porn, etc are illegal not because they are gross or shocking or upsetting but because they are harmful. They cause harm.


So ISIS execution videos don't cause harm, but child porn does? I mean, I think it seems clear that they both cause harm in pretty much the same ways (they give an audience to horrific acts and can encourage the acts themselves).


To be honest, I remember a lot of videos of that kind, and you had to be logged in to prove your age. Maybe I searched for that kind of stuff back then, when it was still allowed, or maybe they were not actively hunting down that content.


"Working as intended" is not a valid response.

So any well-coordinated group has the right to censor based on its size? 4chan, white supremacists, Greenpeace, and Black Lives Matter could all take turns removing a voice from the Internet.

No.


Maybe have a gradual response: when a video is flagged, unlist that video, not the whole account. It seems he had a previous issue, but overall he was more good than bad.


You can't get taken down for a single video, so people were reporting other videos.


The problem with large platforms is that they are open to abuse. You see the same problems at Facebook, Twitter, YouTube, etc.; it's basically an infinite problem.

All of the "machine learning" algorithms aren't really that intelligent, because they have no real human context, so then you augment that with a trust and safety team to do manual review.

When you couple that with a platform that has millions or billions of users, you simply do not have enough trained people doing the review process, so legitimate users, even ones with large followings, sometimes get nabbed.

Just as legitimate people with fewer subscribers can be caught in the same web.

Then you are basically at the mercy of the queue and the number of actual people they have doing the manual review work, which in YouTube's case I'm guessing is rather massive.


The problem is not one of intelligence, but of consistency. Even human operators would struggle with this sort of stuff, though at least they have empathy and personal bias, which just makes the outcomes swing one way or the other even when they don't make sense.

AI struggles with this problem because it's a vague and poorly defined one. That's why you have to take the only way out: either ban it all or allow it all. For the latter, I'd echo what someone else in this discussion mentioned: let users self-censor what they don't like, rather than trying to come up with a one-size-fits-all classification rule that doesn't really cater to anyone.

I would liken it to a multi-dimensional classification problem in which each person has their own classification. We're essentially trying to pigeonhole a single delineation between acceptable and unacceptable topics onto an entire population. It's borderline stupid behavior, and I have no idea why anyone thinks it'll please everyone all the time.
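
To make the point concrete, here's a toy Python sketch; all names, dimensions, and numbers are made up for illustration. Each viewer has their own tolerance per content dimension, so no single sitewide cutoff classifies correctly for everyone:

    videos = {
        "explosion_test":   {"gore": 0.1, "danger": 0.8},
        "insect_microwave": {"gore": 0.5, "danger": 0.3},
    }

    # Each user's own tolerance per dimension: their personal "classification".
    users = {
        "squeamish": {"gore": 0.2, "danger": 0.9},
        "daredevil": {"gore": 0.9, "danger": 0.9},
    }

    def acceptable_for(user, video):
        """A video is fine for a user iff it stays under *their* limits."""
        return all(video[dim] <= limit for dim, limit in users[user].items())

    for u in users:
        for v, scores in videos.items():
            print(u, v, acceptable_for(u, scores))
    # "insect_microwave" passes for daredevil but fails for squeamish:
    # any single global threshold necessarily gets one of them wrong.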


Google really needs to have a human contact who can be reached for channels with a large number of subscribers. Advertisers provide Google with nearly all of its revenue, but then again so do content providers, since Google provides next to no content of its own. The fact that we have large channels being arbitrarily taken down with no warning shows how obtuse Google is.


Or maybe content creators need to get away from Google.


It's still possible. Visit China. Both sides need to learn from each other to move forward.

Blind faith in either fully open or fully controlled networks is the sign of an unimaginative, robotic mindset that wastes everyone's time.


Blind faith in any paternalist system which makes judgements for us about which information is appropriate for us results in western democracy, whose death throes due to information poverty we are now seeing.


Suggestions for YouTube I've seen here and elsewhere that I like:

- flagging a video suspends that video, not the channel

- flagging a video hides the video (or channel) from the user who flags it (this is one of my favorite features on Reddit, allowing me to distance myself from a few subreddits I prefer not to see); see the sketch after this list

I do not like:

- a few flags to a single video taking down the entire channel. It feels disproportionate to me.
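
A minimal sketch (Python) of how the per-user model above could work: a flag hides the video for the flagging user only, and only a large volume of distinct flags escalates to human review; nothing is taken down automatically. The threshold and names are hypothetical:

    from collections import defaultdict

    REVIEW_THRESHOLD = 1000  # flags before a human reviewer looks (assumed value)

    hidden = defaultdict(set)  # user_id -> video_ids hidden for that user
    flags = defaultdict(set)   # video_id -> distinct users who flagged it

    def flag_video(user_id, video_id):
        hidden[user_id].add(video_id)  # takes effect for this user only
        flags[video_id].add(user_id)
        if len(flags[video_id]) == REVIEW_THRESHOLD:
            queue_for_human_review(video_id)  # escalate; never auto-suspend

    def is_visible(user_id, video_id):
        return video_id not in hidden[user_id]

    def queue_for_human_review(video_id):
        print(f"escalating {video_id} for manual review")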


It's not just disproportionate; it's going to push people away from the platform.

YouTube isn't too big to fail. If you push out the people who are creating the content your business model relies on, you'll fail.


Managing important infrastructure needs to be independent of the exploitation, management, and/or regulation of content. You are either involved in editorial decisions or not. If you deliberately stay out of content decisions and act like a common carrier, your liability should be limited and you get safe-harbor protection. Editorializing and acting like a manager brings responsibility, and liability.

YouTube (Google/Alphabet) needs to decide which level of involvement it wants to have. Does it want to stay out of these decisions and close accounts only with a valid court order? Or is it a manager with greater power... one that might be liable for problems caused by its negligence?


Does Google not care to fix this issue, or do they fully understand the consequences for content creators and allow it to persist anyway? I.e., are they so beholden to advertisers that it has to be this way?


> Does Google not care to fix this issue

Why would they care? YouTube is the only game in town for content like this. Where else is this channel, or a new one looking for a platform, gonna go?


Vid.me is attempting to become this, although they are very far behind.

The problem is that basic large-scale issues (business, not technical) become hard or impossible to handle without serious investment and cash flow.


Does Google care, given that there are no alternatives?


From his video description: "I will be moving videos over time to other sites and to here". I think this content duplication/redundancy (likely with Vimeo, and I'm not sure what else) is going to be the future. I wonder what sort of legal challenges may come to the fore if such battles ensue.


Par for the course. YouTube has gone completely insane, and it is leaving a huge opening for less completely insane competitor platforms.


YouTube is just really, really desperate to kill YouTube by any means necessary these days.



