Something is wrong with children's videos on the internet (medium.com/jamesbridle)
336 points by IBM on Nov 6, 2017 | 112 comments



I would say the author dragged the readers into the dark depths of YouTube, but the billions of views on these videos belie that metaphor.

The ending is as pointed as it can be. It's hard to define what the solution is and how to go about dividing up responsibility.

Technology companies create these relatively neutral platforms, which then grow and get gamed. In this case these videos are vying for mass attention from children, which is subsequently monetized. They optimize, tweak, and mass-produce their videos, paying regard only to the amount of attention they can secure. Taste, morals, exploitation of children, and everything else are meaningless so long as their videos receive an adequate number of views.

They did a good job of extrapolating this issue to other problem areas, such as radical left/right videos or conspiracy videos. Here is an example of this issue in the form of Google results from yesterday's mass shooting: https://twitter.com/justinhendrix/status/927335154707828736

I think the lion's share of responsibility lies with the technology companies and governments. I'm hesitant to have government involved given its inability to keep pace with or understand new and developing technologies. It's also hard to define how to solve this problem without censoring speech or disenfranchising it. It's hard for me to define what the absolute issue is and what to call it.

A "seemingly neutral platform" can become corrupted or systematically abused. You constantly need to account for bad actors and gray actors.


Re: billions of views:

"Once again, the view numbers of these videos must be taken under serious advisement. A huge number of these videos are essentially created by bots and viewed by bots, and even commented on by bots. "


That is an important caveat, and it's hard to accurately pin down since there's no way of knowing how many of these views are generated by users versus by bots.

For the real view count of billions to be wrong, bots would need to account for over 90% of the views, and that's only considering the two channels the article referenced.

That bot percentage would have to be much higher if you factor in other channels in a similar vein. https://socialblade.com/youtube/top/500/mostviewed

Though you also have to factor in how many channels are just run-of-the-mill egg-opening content vs. more disturbing children's entertainment.


Where are you getting 90% from? There are only a little over 7 billion people on the planet, and a little less than half of those have an internet connection, last I saw. If something got billions of views, either the entire internet population watched it, or a smaller group watched it many, many times. Sure, some things are viral, but what are the odds that they have near 100% penetration or that everyone who sees it just plays it on repeat?


My two year old daughter has watched "Let It Go" 12 times so far today and it's only a little after noon. I could very much see the human target audience of this content watching it on repeat.


Little Baby Bum's 13 billion views + Blu Toys Surprise Brinquedos & Juegos's 6 billion views = 19 billion views. If 90% of those views are from bots, that still leaves about 2 billion views from actual people. I'm not saying there are billions of people watching these videos. I'm saying that even if the vast majority of the views were from bots, that still leaves billions of actual views. I would agree lots of folks would play the videos on repeat. The actual number of folks/children in this sort of video matrix would be even trickier to determine.
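To make the arithmetic explicit, here's a quick back-of-the-envelope in Python. The 13 billion and 6 billion figures are the channel totals cited above; the 90% bot share is the hypothetical being discussed, not a measured number:

    # Back-of-the-envelope: human views remaining if 90% of views are bots.
    little_baby_bum = 13_000_000_000   # channel total cited above
    blu_toys        = 6_000_000_000    # channel total cited above
    bot_share       = 0.90             # hypothetical bot fraction, not a measurement

    total_views = little_baby_bum + blu_toys
    human_views = total_views * (1 - bot_share)
    print(f"{human_views / 1e9:.1f} billion human views")  # ~1.9 billion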


When I worked in newspaper web tech, I did an analysis of our front-end traffic, grouped by IP ranges, user-agent claims, and so on. I found that the majority of our bandwidth went to spiders and bots. The majority of our hits were also robots. I couldn't do anything about that without big spending with our cache vendor to prevent it.
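For anyone curious, the gist of that kind of log analysis is easy to sketch. Here's a rough Python illustration; the tuple log format and the user-agent keyword list are made up for the example, and real bot detection also looks at IP ranges, request rates, and robots.txt behavior:

    from collections import Counter

    # Crude heuristic: user agents that identify themselves as crawlers.
    BOT_HINTS = ("bot", "spider", "crawler", "slurp", "curl", "wget")

    def looks_like_bot(user_agent: str) -> bool:
        ua = user_agent.lower()
        return any(hint in ua for hint in BOT_HINTS)

    def summarize(log_records):
        """Tally hits and bytes by rough client class from (ip, user_agent, nbytes) records."""
        hits, bandwidth = Counter(), Counter()
        for ip, user_agent, nbytes in log_records:
            klass = "bot" if looks_like_bot(user_agent) else "human?"
            hits[klass] += 1
            bandwidth[klass] += nbytes
        return hits, bandwidth

    # Example with fake records:
    sample = [
        ("66.249.1.1", "Googlebot/2.1 (+http://www.google.com/bot.html)", 12000),
        ("192.0.2.10", "Mozilla/5.0 (Windows NT 10.0) Firefox/56.0", 48000),
    ]
    print(summarize(sample))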


I have seen these videos pop up on YouTube, and that was the last time I ever let my children use it.


Honestly — I think there is an easy solution. Require platform vendors to use “trusted human” curation for videos shown to children below a certain age.

To avoid having to make a single decision about who constitutes a “trusted human” — democratize the curation process to allow arbitrary numbers of groups of humans to exist where each group can supply only one “kid approved yes/no” recommendation per video.

Then deal with the proliferation and ranking of moderation groups by making each parent manually select the set of agencies which are allowed to mark a video as child viewable. Audit the practices of the most popular agencies to ensure they consist of humans making decisions based on some human notion of child interest.

If there are insufficient numbers of moderation groups, threaten regulation or introduce regulation and provide funding for moderation non profit organizations.

This kind of setup would severely fragment the market for gaming kid psychology and likely ensure insufficient profit motive for it to continue at a meaningful scale.
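A minimal sketch of how that scheme could be modeled, assuming (my assumption, not stated above) that a video is shown only when at least one agency the parent has selected has marked it "kid approved":

    # Hypothetical data model for parent-selected moderation agencies.
    from typing import Dict, Set

    class ModerationAgency:
        def __init__(self, name: str):
            self.name = name
            self.verdicts: Dict[str, bool] = {}  # video_id -> kid approved yes/no

        def review(self, video_id: str, approved: bool) -> None:
            self.verdicts[video_id] = approved   # one verdict per video per agency

    class ParentProfile:
        def __init__(self, trusted: Set[ModerationAgency]):
            self.trusted = trusted               # agencies chosen manually by the parent

        def can_show(self, video_id: str) -> bool:
            # Unreviewed videos stay hidden by default.
            return any(a.verdicts.get(video_id) is True for a in self.trusted)

    # Usage sketch:
    curators = ModerationAgency("nonprofit-curators")
    curators.review("abc123", True)
    profile = ParentProfile(trusted={curators})
    print(profile.can_show("abc123"))  # True
    print(profile.can_show("zzz999"))  # False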


I think the responsibility should lie with whoever is making money off it.


The solution is simple - remove the public view counts. People (producers and consumers) are unnecessarily over-influenced by them. There is a huge global population of semi-literate people on the net today who have no concept of how to parse all the information they are seeing. The numbers are their guideposts. They learn quickly how to pander, because the vast majority of content is pandering to the lowest common denominator.


That's not a solution as long as "popular" videos still get higher ranking in searches.


My kids are on YouTube Kids all the time, and this is terrifying:

> "Someone or something or some combination of people and things is using YouTube to systematically frighten, traumatise, and abuse children, automatically and at scale, and it forces me to question my own beliefs about the internet, at every level"

Something about a lot of the videos I have watched along with my kid has always felt "off" to me. It makes 100% sense that a lot of the content on certain channels might be produced algorithmically, and uploaded en masse.


Last week my girls were watching some obviously child-focused YouTube videos, and the YouTube app showed a commercial for the new season of Walking Dead.

Seriously, an ad showing graphic scenes of zombies or other monsters on a video clearly targeted for small children.

I don't even know how to go about complaining in a way that won't be piped to /dev/null.


Write to a senator who gets large donations from classic media companies like Time Warner, and ask them to apply the same advertising standards that the FCC uses to the web.


Besides generally controlling screen time, I don't let my kid have free rein in the YT app or website itself. This is one case where a walled garden is actually a good thing.

I always vet them first, then download the videos and play them off our local NAS box or the local device (that way, no ads).
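For the download step, something like youtube-dl's embedding API works; the output template below (pointing at a hypothetical NAS mount) and the video URL are placeholders, so adjust to taste:

    # Sketch: download a pre-vetted video to a local share so it can be played
    # offline, without ads or recommendations.
    import youtube_dl  # pip install youtube_dl

    ydl_opts = {
        "outtmpl": "/mnt/nas/kids/%(title)s.%(ext)s",  # hypothetical NAS path
    }
    with youtube_dl.YoutubeDL(ydl_opts) as ydl:
        ydl.download(["https://www.youtube.com/watch?v=VIDEO_ID"])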


Don't let them watch one minute of it, it's all trash.


YouTube has tons of great videos that are awesome for watching with children. My 8-year-old loves the PBS Digital Studios series like Eons and It's Okay to Be Smart, and Crash Course.

Even I like some of the better nursery rhyme videos in foreign languages, for my target language. I hadn't understood that even the good channels repackage their materials into different compilations for the ad views. But for us YouTube is something that is on the TV in the living room and we can all see and agree on the videos we watch. Still it's hard to filter out the filthy language, weirdness, and pointless cruelty.

The problem the author identified exists in other parts of the kid Internet. With Web-based games, sites are aggregating weird and disturbing games with the OK or just junky ones, and kids are sharing these sites with each other.


What part is trauma/abuse? I don't get it.


Practically speaking, all of it. The prolonged effects of handing over an iDevice are damaging to their psyche. Kids aren't in a position to make rational decisions about what is and isn't acceptable content. The YouTube Kids content filtering isn't nearly as advanced as it should be, so parents end up spending a considerable amount of time attempting to filter, with no success. These videos, along with those "Daddy Finger" songs, adults unwrapping toys, and "Ryan's Toy Review", where the little brat gets all the toys and destroys them, are mind-numbingly pointless and damaging (considering the lack of value). At some point, parents need to consider the unknown factors and the possibility of (incidental) trauma.

At least, that's the conclusion I've come to after watching my four-year-old consume some of the above. Counter to that, the reduction of screen time has turned her more empathetic toward her sibling, though I can only state that qualitatively.


I still don't see what actual harm is being done here.

I agree that screen/device time is generally bad, but this goes whether it happens to be YTKids, Netflix or games.

What specifically is "wrong with the internet"? I still don't see the issue here.


Did you watch the video where a series of Marvel heroes were captured and buried up to their necks in sand? These videos are nightmare fuel for children.


The violence in the superhero video I agree (esp. with the slapstick music) is a bit disturbing. Same with the gorilla daddy finger.

So what is the immediate solution for parents? Block/uninstall YTKids?


Yes. Block it. Turn it off. Try handing them crayons and paper. YTKids is a product begging for your time and attention and that of your kids. It sucks and beyond that may be damaging so stop buying it (with your kid’s attention). Turn it off.


Never let your kids browse the open internet without supervision. Create curated playlists that they can watch when you aren't able to devote 100% of your time to their supervision but still want to occupy their time with videos.


Block. If not, then filter the content, though I can tell you their content-blocking algo is terrible at blocking new content, so you end up in a game of cat and mouse.


In another comment someone said that YTKids has an option to disable related videos; that could be a good start.


Yup. Get video apps from actual companies that have at least a semblance of QC.

PBS kids, Disney Jr, Nick Jr.


I think the author's implication is the 'wrongness' is the development of an ecosystem which encourages the creation of sadistic content marketed to children.


Did you read the article and watch the example videos?


I found this to be a tediously drawn out read - the New York Times article is much better and to the point (without continuously warning me about reading further).

https://www.nytimes.com/2017/11/04/business/media/youtube-ki....


I think the medium article has a different and interesting take and conclusion about what the real issues may be. I agree it is quite long.


I have a 2 year old daughter, so I can relate to how enthralling those baby bum videos can be to a 2 year old.

But human oversight is not "impossible"; it's called "parenting", and we should try it some time. We removed the TV from the living room and put it in the guest room; a beautiful painting now sits in its place in the main part of the living room. Our kid doesn't use our iPhone or iPad, it's as simple as that. If she cries for it, then she cries for it - no big deal. Ever since we removed the TV, she doesn't even ask to see videos in the living room anymore. You just gotta set limits.

Is responsibility for the self such an arcane concept? If you choose to get ALL your news from one site, whether it be Facebook or YouTube, then of course there's a chance you're not going to get the whole truth. It should be all of our responsibility to seek out better sources of news. Install a plugin, find a better search engine, get a better app, or go to something reputable like NPR, the New York Times, or the Guardian. Lots of alternatives are out there, as long as we're not too lazy to seek them out. There are plenty of ways to diversify your news sources and verify the things you read and watch. There's got to be a plugin out there that blocks news from FB if that's what you want.


Part of the point of the article is that these systems have made the job of the parent (or the individual) much more difficult than it was in the past. Cutting the internet and TV out altogether is certainly one way to prevent your child from being exposed to some of these algorithmic horrors, but shouldn't there be a way for parents to easily say "don't show horrific things to my child" to YouTube? As the article points out, that's what built brands like Disney. They built a reputation on "safe for kids", and YouTube (and Twitter and Facebook) have made it harder for those kinds of reputations to be built or to matter. On the internet, everyone is a dog.

What so much of this seems to boil down to is spam. It is kind of incredible that spam was shut down so effectively and completely in the world of email, but has persisted, much to everyone's detriment, on Facebook/YouTube/Twitter/etc. The only explanation that makes sense to me is that spam fighting was a differentiator between email clients, while Facebook et al. are actually incentivized to have spam count as impressions.


Yes. Peppa Pig looks OK. The problem is if YouTube recommends or continues on to a video where a machine splices Peppa Pig with parody videos and gross-out videos and genuinely weird and disturbing stuff.

The problem isn't merely that this stuff exists, but that YouTube is recommending and autoplaying it.


This solves the problem for your family, but doesn't solve it at large. There will still be parents who let their kids watch this. There will still be impressionable youths watching nazi propaganda. There will still be gullible people watching "The earth is flat" videos.

Yes, this is a problem that you can solve for yourself by moving to a cabin in the woods and hunting and skinning deer, but that's just a bandaid.


But, even without Youtube, you still need to supervise their access to the internet. But, AI can help in this regard too. Just install some parental control plugin that only allows educational content to come onto the device. Set it up with a timer, so they're not on the device all day.


> But, AI can help in this regard too. Just install some parental control plugin that only allows educational content to come onto the device.

That's what Youtube Kids purports to be. You're just suggesting sending in more trains: https://www.youtube.com/watch?v=v5JiPj9c98Y


Isn’t turning to AI what enables this to happen in the first place? My understanding is that that's exactly what YouTube Kids IS. Unless you mean more accurate technology, in which case you should say that.


People keep bringing up nazi propaganda as if it's a serious threat, but is it? The most successful radicalization stories I've heard are about people being recruited into ISIS. The political echo chambers that have the most reach are the censors at Google and the campuses they graduated from, stemming from the cult-like application of progressivism. The most pressing issue of science illiteracy is climate change denial, pushed by capitalist interests.

People are worried about the one ideology we are most inoculated against, and for which tens of thousands can be whipped into a frenzy to protest against, _even if no actual nazi event is taking place_. This is not a real issue. Neither is being upset that you're letting your kids be babysat by bots who make Sims machinima. If the author feels there's a flood of dangerous content out there, why is the article full of milquetoast embeds?


This may not be Nazi propaganda, but it's certainly far-right propaganda:

https://twitter.com/justinhendrix/status/927335154707828736

Where are all these cult-like progressive echo chambers at Google when you need them?


Climate denial videos would have been a much better example for me to use, rather than flat-earth videos.


So, let them watch it? You and the GP are talking about different sets of people: A 2-year-old kid probably has no frame of reference to decide what to watch and what to avoid, so you step in - as the parent - and decide for them.

In all other cases, the solution lies in what the GP mentioned: Personal responsibility.

Sure, it's not nice to have teenagers get radicalized by Nazi propaganda, but I'd argue that this is not a failure of the content they have access to, but rather hints at serious issues in their life outside of Youtube. Same with flat-earthers (btw, serious question: What's the problem with people believing the world is flat exactly? You see them for the fool they are and move on, what's the big deal?): If someone chooses to educate themselves on that sort of conspiracy-theory, let them.


> What's the problem with people believing the world is flat exactly? You see them for the fool they are and move on, what's the big deal?

For Starters:

https://en.wikipedia.org/wiki/Vaccine_controversies

https://letters2president.org/letters/18983

https://en.wikipedia.org/wiki/Salem_witch_trials

https://en.wikipedia.org/wiki/Racism

https://en.wikipedia.org/wiki/Sexism

https://www.ushmm.org/educators/teaching-about-the-holocaust...

I mean, really? Come on, a stupid population is so obviously a bad idea. Remember the saying: Play stupid games, win stupid prizes? Guess what happens when everyone around you is stupid? Not like 23 chromosomes stupid, but 'Oh, the Earth is Flat' stupid. Good god man.


This is one of my worst nightmares, and seeing it become reality is very depressing. Slowly eroding education system, avalanche of fake, malicious, agenda-driven content from all around you, what's a man to do?


Downvote and move on.

Call it out when you see it happening, damn the consequences.

Generally, follow the responsibilities of citizenship (https://www.uscis.gov/citizenship/learners/citizenship-right...):

Support and defend the Constitution.

Stay informed of the issues affecting your community.

Participate in the democratic process.

Respect and obey federal, state, and local laws.

Respect the rights, beliefs, and opinions of others.

Participate in your local community.

Pay income and other taxes honestly, and on time, to federal, state, and local authorities.

Serve on a jury when called upon.

Defend the country if the need should arise.


You're looking at this on a personal level; I'm trying to look at it on a bigger societal level. I don't want to live in a country populated by nazis and flat-earthers, no more than I want to live in a country where you can smoke in restaurants, drink and drive, or advertise cigarettes to children.

I might be able to see people for the fools they are, but I still have to live with them, live next to them, and live with the impacts they're having on society, and so will my children.


I have a 4 year old, and I let her have some limited video time. I am familiar with most of the videos mentioned, but this is the first time I have heard of the violent versions.

I pre-screen what she can watch, and I limit it to about 20 minutes max. YouTube is not on the list, but I will occasionally play music videos as she likes to sing.

Amazon has a number of kids' learning shows: Creative Galaxy, Tumble Leaf, and Luna. I wish they would expand these out to more seasons, as they have been very well thought out. I have not found any issues with them.

The real solution, in my opinion, is to screen stuff before you let your kids watch it. Leaving a screen open to a site like YouTube is asking for trouble, as you're not sure what is going to be suggested in the sidebar that they might click on.


We used to let our kids have semi-supervised YouTube Kids time on our iPad, but they were getting into content that wasn't age appropriate through recommended videos. (The app now lets you turn off search and recommended videos, so the majority of content is from vetted, more family-friendly producers like Disney, Nickelodeon and DreamWorks.)

But even if we selected good videos only, we came to realise that (along with video games) we were abdicating parenting to a screen so that we could ignore our children for a while. This is to some extent a useful stress relief strategy – we all need a break sometimes – but without being anti-technology about it, there are better things children can be doing with their time.


Many people here mention "parenting" as a solution to this problem, and although that is a solution available to the wealthy and time-rich audience of Hacker News, on the other side of the world, probably in your own country and certainly in poorer countries, people aren't able to put as much attention into monitoring their children or to put in the time to understand the many harms of the modern world.

The real problem is the mass effect of these people. The majority of the world.

We all like to think we are above manipulation, maybe as a form of egoism, but in reality we are all simple creatures compared to the momentum of science.


>We all like to think we are above manipulation

I never could understand why people have such a hard time admitting that a literal TEAM of PhD-holding psychologists, who have spent their entire lives designing ways to trick and manipulate you, could possibly be successful. That'd be like saying Ford couldn't possibly build an engine better than the one you hacked together from plumbing parts, just because you can't fathom the concept of people being good at their jobs.


This might not be a comment people expect. I am a lover of abstract and surreal art and music, and while I personally don't like surreal stuff that gets gory or sexual or violent, some of the "off putting" bizarre videos the author shared here I found fantastic. I really liked the "wrong head" video for example.

I'm not sure how anyone is supposed to take that, perhaps it indeed is a sign these things might not be for children. :) I'm not sure how I'd feel if my nephew was subjected to this.

But with that said, I do want to remind people that there were abstract and surreal themes and art in children's media before. Winnie the Pooh had the "heffalumps and woozles", and then there was Halloween Is Grinch Night, with its bizarre ten-minute stretch of ghouls and shifting shapes trying to scare the main character. There was violence in Tom and Jerry, Looney Tunes, and almost every cartoon I watched. The gore and extreme sexual content are probably a bridge too far, but it's not like some of the milder content is that far outside what we watched as kids.

EDIT: I do admit though that this is different and I mostly agree with the author. The surreal stuff in kid's cartoons had some artistic intent. The "surreal" stuff here is essentially unintended, and merely the shoddy examples of a system meant to exploit children.


Yea, as nightmare fuel some of that is fantastic. But the larger problem is YouTube autoplaying and automatically recommending it to a naive audience. And the larger processes that incentivize people to make this stuff for children.


This is a such a cyberpunk story. A malevolent Library of Babel.


My own thinking on this has twisted from cyberpunk to lovecraftian lately. The best analogy I've been able to make lately is the warnings in classic fairy tales that fairy places are dark and full of horrors, and it is easy to be trapped within them if you aren't careful. (I even wrote a short story in a couple of days just recently trying to capture some of my thoughts and concerns/fears in this space [1]. I don't have much in the way of answers, either, just existential dread that I hope may lead to answers eventually.)

[1] http://blog.worldmaker.net/2017/10/18/astral-plane-meteorolo...


Indeed! But then again, I think the genre boundary between cyberpunk and fairy tale was always permeable. In Count Zero, evolved AIs possess people like voodoo deities. In Serial Experiments Lain, dead people live on as ghosts in cyberspace. While Snow Crash plays it for laughs, the idea that ancient Sumerian gods live on as latent patterns in our brains (ready to override and erase our consciousness if the right word is spoken) is pretty Lovecraftian too.

Maybe one of the purest exponents of this "Lovecraft-via-cyberpunk" aesthetic is Nick Land; his early works are very influenced by 1980s cyberpunk science fiction, and (as far as I can make out from his rather opaque writing) his main thesis is that the combination of AI and Capitalism will evolve into an entity literally beyond human understanding, which will get rid of us all as it goes on to pursue its own inscrutable goals...


Absolutely, did not mean to imply I wasn't standing on the shoulders of giants in that genre cross-over, more that the existential dread seems to get more dreadful today [0] than in early cyberpunk. In addition to examples you list, certainly Charles Stross' Laundry series is a particular influence for me here as well. (One book of which was so close to home I felt a need to only read it outside during daylight.)

[0] ETA missing splice: as opposed to "punk"/DIY/rebellious/creatively-destructive. It's almost hard to imagine we've crossed some blurry line where early cyberpunk seems almost positively hopeful compared to reality.


Your last sentence also sounds a lot like Economics 2.0 (and some other things) from Charles Stross's magnificent book Accelerando.


That work is rumbling in the back of my head in every bitcoin discussion.


John Curran (ARIN CEO) read this post at a conference in the spring: http://astercrash.tumblr.com/post/157419046864/did-anyone-no...

Like we’ve got this dimension right next to ours, that extends across the entire planet, and it is just brimming with nightmares. We have spambots, viruses, ransomware, this endless legion of malevolent entities that are blindly probing us for weaknesses, seeking only to corrupt, to thieve, to destroy.


Read/watch Coraline by Neil Gaiman if this sort of thing tickles your fancy.


Just some irrelevant feedback. I was unable to read the text of the story without inverting screen colors. The contrast was too low and lines too narrow.


Thanks for the feedback; narrow lines should help the contrast issue in theory (less eye movement to read a paragraph), but I know there's a wide range of visual acuity, and I can't account for it all on my own.

That said, I realize the current contrast level of the main body text is an uglier compromise than I would like, so I've filed an issue on my little CSS library to remind me: https://github.com/WorldMaker/lcars-moderne/issues/2

Also, I've tried to make sure that the Reader View in both Firefox and Edge works well with my blog. I'm still surprised Chrome has not enabled its Reader Mode by default like the other browsers.


> The example above, from a channel called Bounce Patrol Kids, with almost two million subscribers, show this effect in action. It posts professionally produced videos, with dedicated human actors, at the rate of about one per week.

This is basically lifted straight from The Diamond Age. Stephenson foresaw something weirdly like this, except it wasn't automated.


Another good short story along these same lines that was on HN a while back: http://nautil.us/issue/53/monsters/the-last-invention-of-man


I think some of those videos are disturbing indeed, but I have to remember the kind of books that were considered children's literature when I was growing up: Oliver Twist, Nobody's Boy [1], the Brothers Grimm fairy tales, and various folk tales chock full of dragons, ogres, giants, evil witches, curses, evil, death, blood and torment.

Isn't Little Red Riding Hood the most common fairytale kids in the Western world are nursed to sleep with? That's a story of a little girl and her grandmother being eaten by a wolf, who's then eviscerated with an axe.

And, because I'm Greek, my readings included tomes of Greek history and legends "for kids", where someone always got killed, exiled, abducted, raped, turned to stone or transformed into an animal, an insect [2], or a tree, including of course the story of Prometheus. You know, the guy that got chained to a rock and had an eagle eating his liver every day.

I read all that as a pre-teen. As a teen I read H.P. Lovecraft and Harry Potter. Harry Potter is probably the most benign, a series full of people getting murdered left and right, in all manner of grim ways, killed by monsters, their soul sucked by evil spirits, exiled to nether dimensions and blown to bits by spells cast by evil warlocks. Lovecraft is probably too scary even for some adults.

And yet, guys, I'm alright, OK? I'm not an axe murderer, a maniac, or a psycho. I'm not violent, I'm an independent adult, I'm not a criminal, or a drug addict, or an alcoholic, or a gambler, or all manner of fucked-up things that little kids can turn to when they grow up in this cruel world.

So, what's up? Is it really the themes of violence and terror that's harmful to kids, or the way those themes are presented? I certainly would rather my kids read old Hesiod, than watch those mind-numbingly stupid "kids'" youtube videos. But, if I could survive the story of Prometheus without turning into a psycho, I think most kids should be able to outgrow Hulk and Spiderman being buried in sand to their necks.

____________

[1] The only thing more depressing than that book are the books by A. J. Cronin, which I got as birthday presents in my teens.

[2] Yeah, I know insects are technically animals, too (as in members of the kingdom animalia). But, come on. Not animals animals...


I was also an early reader, reading content you would consider adult long before I hit puberty.

However, it was also obvious as a kid—and I am curious if this is a common experience—that I couldn’t regulate my emotions when watching movies. Scary movies would give me insomnia; i would literally burst out crying when watching something even emotional on screen that I couldn’t understand.

Meanwhile, with books, things are either understandable or not. You learn quickly that humans are weird, cruel, loving, etc—but there’s always that wall of explicit fiction, where you can pause mid-sentence just by looking elsewhere, and it leaves you in control of the experience.

Movies force you to process the information at the rate of the film, and that is honestly far too quick for children to understand everything adults easily see, and images don’t pass like words do. Hell, I still remember the scary scenes from Snow White; but only one book has ever scared me—The Raw Shark Texts.

I imagine I also exercised some self-control when I saw some stuff I didn’t want to read—I remember putting down the Bourne series at age eleven after a depiction of sexual assault, but mostly because other things interested me more. (Probably the Ender series—very little sex, lots of games and prognosticating!)

Tl;dr give your kids pretty much anything to read, but movies are more intense and you should watch them with your child to pause and explain or just shut off.

P.S. Fairy tales were aimed at adults. Children were considered small adults. It is a recent fabrication that these stories are intended primarily to entertain children. I believe this happened mostly after the Grimm brothers did their work, though.


Yes. Reading a story to a child is an inherently moderated experience. If the child gets scared, you can stop reading. You can also filter out inappropriate content in real time—some of the Andrew Lang fairy tale collections for example have racist language and anti-Semitic content.

Reading is active while watching video is passive. You have to work your way through a text, instead of mindlessly letting a weird video play on to see what's next. Reading requires more motivation and interest in the material.

Lastly, the main issue the author identifies is how cheaply-produced and algorithmically-generated video content will necessarily generate weird and disturbing content in a way that doesn't occur with authored books. The author is most disturbed by videos that are not human-led, or where machines are splicing together human-led content with parodies and gross-out videos.



The conversation about parenting is somewhat of a red herring here. This is not a ‘won’t somebody think of the children moral panic’, but a serious consideration of the unintended consequences of the systems we are building and what that means for us as a society.

YouTube videos for kids is a fairly lighthearted entry to the subject - we could be discussing news, porn, education, whatever, and the problems and implications would be the same.

The key issues raised here are:

* The ‘delamination’ of content and author and how that affects the awareness and trust of its source. If, for example, a scientific paper and a bunch of woo is presented in the same way who can tell which is more legitimate? ‘Just teach critical thinking’ is not an acceptable solution as by the time enough people have been taught to provide herd immunity we will have long since succumbed to this pandemic of bullshit.

* ‘...the impossibility of determining the degree of automation which is at work here’. If both humans and machines are creating content tailored for every possible niche, interest, fetish and keyword combination, and algorithmic personalisation makes it possible to exist entirely within our own personal tag clouds what does it mean for us as a society, which requires a basic set of shared values to function?

Junk content and pandering to base instincts is not new, but our ability now to automate the creation and dissemination of such content to pander to every possible interest and unlikely combination of interests at vast scale is new - and we do not yet have the cultural toolkit to deal with the sheer quantity of it.

I wonder whether in a few years time, once we’ve really felt the impact of all this, people will look on today’s enthusiasm for putting ‘social’ in everything with the contempt and horror we do with last century’s enthusiasm for putting radium in everything.


Solid description of the important part of this essay. The automated generation of content that perfectly matches to humanity's unconscious desires is terrifying to watch.

I also worry that it's inevitable. Even if YouTube or PornHub or Facebook decided to introduce some sanity or human oversight into their systems, that's such a massive waste of resources that competitors would beat them out. It's a race to the bottom for automation and machine learning. I genuinely can't think of any way to "go back in time" at this point, other than perhaps moving the entire internet away from its dependence on advertising.


> Once again, a content warning: while not being explicitly inappropriate, the video is decidedly off, and contains elements which might trouble — frankly, should trouble — anyone. It’s very mild on the scale of such things, but. I describe it below if you don’t want to watch it. This warning will recur.

This, before an animation that has a happy song with characters and an animation of a little girl appearing periodically and crying briefly. How sheltered do you have to be before this is traumatic? It makes me wonder whether the author is a bot.


I believe there was recently (this year) a big furor in the right/alt-right about this sort of thing, except it went much deeper, somewhat a la "Pizzagate". It felt very conspiracy-theoryish. I didn't read a whole lot into it, since it seemed pretty crazy, and the videos linked were, admittedly, pretty disturbing.

I now wonder if there's really something to this, especially with regard to people intentionally messing with/traumatizing kids via these videos.


Ehh... well I'm biased because I would be labelled by people as alt-right.

Basically, just use your intuition... use it boldly. What does it tell you about these videos? It's weiiiird


Can we change the title? There are two things wrong with the internet on the front page currently.

The first is children's videos. The second is the actual internet itself.


Some things I'd like to see happen with the internet. There are probably opportunities here:

1. Proper age validation on pornography. The "You promise you're over 18/21?" approach is not working.

2. A "safe" slice of the internet provided by an ISP, such that content more or less matches existing ratings systems. OpenDNS does this, but it is easily defeated by an educated user. Having this on phones would be good too.

3. A government regulatory body that inspects the real effects of consuming these products. Porn, infinite-scroll sites (FB, Instagram), YouTube video content, etc. are all known to be tailored for addiction, and yet no controls on them exist, though we do the same with tobacco, food additives, etc.


But this is hardly new. Anyone else watch "Dora" videos and notice how they're really a LOT like someone playing a really bad Sierra adventure game? And it's not like that's the only show doing this.

In Belgium, there's a TV character called "Bumba" whose videos (on TV) look like someone injected LSD into a seal, glued a hat onto it, stuck it into a bag and filmed it thrashing around.

Almost all recent kids content is algorithmically generated, on the "real" tv as much as on youtube.

As for toy unboxings, they're actually useful I would say.


Why was email spam so effectively fought, but "next up" spam seems to thrive?


Because SMTP was an open protocol. No MUA company profited from its end users seeing spam. End-user-targeted MTAs (i.e., the MTAs of most end users' ISPs) likewise found spam to be a cost center. (I'm being pedantic here in that the rogue spam-friendly "spamhaus" ISPs, whose customer base was made up of spammers, were technically operating MTAs as injection points. To the overwhelming majority of email users, admins, and providers, spam was a massive cost center.)

At some point in the 90s/00s, ISPs had to bite the bullet and acknowledge that if they didn't filter spam, customers would leave for ISPs who would. And in the SMTP era, y0u could use b4y3s14n techniques to f1lt3r out the v1@gr4 spam.
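(For anyone who never peeked inside one of those filters, the core idea fits in a few lines. This is a toy naive Bayes sketch with made-up training data, nothing like a production filter:)

    # Toy naive Bayes spam filter: score a message by comparing how often its
    # words appeared in known spam vs. known ham. Training data is made up.
    import math
    from collections import Counter

    spam_train = ["cheap viagra now", "viagra discount click now"]
    ham_train  = ["meeting notes attached", "lunch tomorrow at noon"]

    def word_counts(messages):
        return Counter(w for m in messages for w in m.split())

    spam_counts, ham_counts = word_counts(spam_train), word_counts(ham_train)
    spam_total, ham_total = sum(spam_counts.values()), sum(ham_counts.values())

    def spam_score(message: str) -> float:
        """Log-likelihood ratio; positive means 'more spam-like'."""
        score = 0.0
        for w in message.split():
            p_spam = (spam_counts[w] + 1) / (spam_total + 2)  # Laplace smoothing
            p_ham  = (ham_counts[w] + 1) / (ham_total + 2)
            score += math.log(p_spam / p_ham)
        return score

    print(spam_score("viagra now"))          # > 0: spam-like
    print(spam_score("lunch meeting notes")) # < 0: ham-like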

We're not at the point where the video equivalent of spammers can be automatically detected. So it's going to be expensive.

And in an age of walled gardens and ad-tech, the providers of "next up" and "recommended for you" clickbait/spam are not losing money (by having to pay for storage/bandwidth for spam that their end user merely deletes or filters client-side). Rather, sites like YouTube are profiting off of every clickthrough. An ad view is an ad view.

And that is why they have no business incentive to pay the money to hire the humans that are still required to solve the hard problem of differentiating spam from ham. Doing nothing, or pretending to be able to fully automate the problem away via still-error-prone AI, is cheaper than solving the hard problem of curation.


This seems reasonable to me.


I recall a time when email spam was not so effectively fought. I was far more skeptical of posting, linking to, or generally giving out my email due to the potential onslaught of spam afterwards. I assume the challenge here is similar, but for video instead of text.


Google's Search and Youtube algorithms have encouraged the production of a deluge of low quality content and made it impossible to find high quality content. They are ready to be disrupted.


How would you disrupt them?


Parents, do not worry... hip hop killed the evils of rock and roll, and the internet killed the evils of TV. I'm sure something is waiting around the corner to resolve the evils you fear in the internet. In the meantime, take the attacks you perceive being waged against your children as a reason to spend more time with them. It may be that the internet isn't a unique problem after all.


I would like to know the exact intention of Youtube Kids. Is it truly an opportunity for the company to provide a safe place for kids? If so, how exactly are they filtering the contents?

I know, I know: it's free, don't use it, be a parent, blah blah blah. But it is a service being touted as "kid-safe"; are we not allowed to question that?


Ok, wtf moment here, is this snark?

FTA: "Someone or something or some combination of people and things is using YouTube to systematically frighten, traumatise, and abuse children, automatically and at scale, and it forces me to question my own beliefs about the internet, at every level."

All the examples are basically just kid-crack but I've watched these with my kids and they're (mostly) relatively harmless - exception being the weird/slash versions involving scary or sad versions of Peppa pig, paw patrol, etc.

What trauma/abuse is being inflicted here?


Those exceptions are found by the millions. There are depictions of rape, cannibalism, gory death and all kinds of violence. Not even safe for <16 year olds, and we are talking about little children.


Nothing is wrong on the internet.

Something is wrong with parents who don't care what content they put in front of their kids as long as it looks like a cartoon and it shuts them up for a while. Then again, Looney Tunes characters were a lot more violent than this and no one got traumatized, so maybe the effect of this is blown out of proportion.

Yes, this stuff is vapid, and some of it is disturbing, and I suspect might be lowkey fetish porn, and it definitely infringes on copyright (who cares though,) but... the internet isn't your babysitter.


Are you a parent? The way YouTube works - and how centralized video has become - means it's extremely easy to let a kid fall into these dark areas just by playing one innocent video. There is no obvious option to hide the rest of the UI, stop autoplay or recommended videos.

It also is nothing, absolutely nothing like Looney Tunes. Please watch the videos linked by the article before making such a statement. In one of them, Peppa Pig uses a razor blade to slice a piece of her own arm, right after eating her own father.


> In one of them, Peppa Pig uses a razor blade to slice a piece of her own arm, right after eating her own father.

You mean the one clearly on a channel not officially associated with Peppa Pig's creators and that starts with a fake restricted content warning? Ok, yeah that was dark. But Youtube is not a G-rated site, and it's designed to make content discovery easy.

Doesn't Youtube have parental controls? If that's not enough, maybe there's a need that needs to be filled by a browser plugin that restricts the UI or external player. Either way, letting a child on to Youtube unsupervised is taking the risk of having them stumble across content you don't want them to see.


Right, that's the problem he's identifying. Searching for "peppa pig bacon" takes you right to that parody video, which is widely pirated... and given the weird, machine-generated examples he cites earlier, how much of that has ended up in a compilation a human child might see? It's not even that there is inappropriate content on YouTube: it's that these channels are trying to trick children into watching stuff that is this bizarre.


It's the working out of Tim O'Reilly's mechanical turk idea.


My kids are allowed Hacker News and that’s it.


I'm really, really WTF.

I see nothing wrong with these videos at all and the current general opinion here is the real worry.

Not sure what the top-secret videos the author found are that are so bad I can't see them, but I totally have zero faith they back up the author's point, whatever it is.

I think the author needs to read https://en.wikipedia.org/wiki/Grimms%27_Fairy_Tales and get over their moral panic.

There are possibly trolls out there that might create content with the purpose of harm, but I see no evidence yet that they are succeeding in doing this, especially NOT in this article.

Mixed with this normal Moral Panic is the old Barney the Dinosaur Panic.

Shows that don't work on two levels (so just on a child's level) scare parents because they don't get them; the kids do, so the parents fear the unknown.

Screen time for kids is not on topic here, and not allowing kids to see videos for adults is not on topic here, so really there's nothing here backed by evidence.

Unless the top-secret videos do show evidence of videos children will accidentally come across while YouTubing.

Using automation to monetize is pretty interesting stuff though. There's the interesting story.


I share your opinion on the banality of content-related moral panics. I don't think the subject matter covered matters much at all, since young children can't even comprehend the themes that horrify the author.

What really strikes me about these videos, though, is the sheer volume and the degree of engagement optimization demonstrated. That's new and worth thinking about.

We've had scary nursery rhymes and fairy tales for centuries at least. We've had nonsensical children's cartoons for half a century. Nothing has gone horribly wrong.

But I don't think we've ever had industrial-scale production of videos so rigorously optimized for engagement at the expense of everything else.

These videos remind me of the concept of "superstimuli" --- artificially-created patterns that are more attractive than the ones that appear in nature and that lead organisms away from healthy behavior patterns. Highly refined sugar-loaded food is an example of a superstimulus in humans.

These videos are empty. They teach nothing. They're optimized for hitting all the right attention buttons in the brain, but they don't provide any of the training a child's developing neural network needs. What effect is that going to have on development?


I definitely see your point of view and agree about superstimuli. Not necessarily empty though: you do have colors, words, sounds, rhythm, surprise, and movement, but not patience, conscientiousness, etc.

But I think it's the time allowed using electronic entertainment, not the content of the electronic entertainment. I think all of YouTube is superstimuli.

It possibly should be ZERO time, but if YouTube for short periods allows the kids to go on a family trip or keeps parents sane enough to allow for a dinner out, perhaps there's external value.


I wonder if you're a troll, but on the chance you aren't:

> I see nothing wrong with these videos at all and the current general opinion here is the real worry.

I wager you have no children. If you did, and you had made the observations I have, I think you'd feel differently.

> I think the author needs to read https://en.wikipedia.org/wiki/Grimms%27_Fairy_Tales and get over their moral panic.

From the Wikipedia article: 'The first volumes were much criticized because, although they were called "Children's Tales", they were not regarded as suitable for children'

And even today, I think many of them aren't suitable for children (I found them vaguely unsettling as a child).

I have a small child who used to be allowed on Youtube, but we've since disallowed it: the content was just too weird in a bad way, obviously not beneficial (it was amusing in the sense that crack is amusing), and amazingly addictive, pretty much like visual crack (to a degree other videos aren't).

I can say that there's something unsettling going on, not merely from my opinion on the value of the videos (which I find annoying at best), but from the observed effects.


> I see nothing wrong with these videos at all and the current general opinion here is the real worry.

Help me understand how you see "nothing wrong" in videos that are clearly "designed" to get kids (0-10) to watch, but then depict suicide, murder, rape? All those examples are in the article, and you can watch them yourself.

You are thick-skinned, and most of us adults are. We can see beyond the stupidity of those videos because our values have set in. But kids can't. Kids consuming this stuff in their formative years is troubling. How you don't see it is actually staggering.


I'm sorta convinced that out of boredom, many people who are involved in creating children's content will try to push as close to the line of adult innuendo in their work as they can get.

See half the jokes on SpongeBob, or this gem that I saw in a Disney coloring book at the doctor's office the other day: https://me.me/i/mickey-tosses-the-salad-dinney-while-colorin...


I think that's just you.

He's literally tossing a salad in that drawing.


[flagged]


Got any specific examples? Don't think I've run into anything like that.


I guess there are two sections to it.

There is now a lot of self-created media for gay teens and young adults. Media in that YouTube realm occasionally leaks or crosses lines for folks. It depends on the channel and on the particular video. Some channels clickbait with decidedly adult thumbnails and descriptions. For example, "Davey Wavey": their channel is mostly NSFW. The issue is that some folks don't mark their videos as adult and thus end up in younger audiences' streams. It's really on a channel-by-channel basis. I haven't dived into that space recently, though I can see that market effect coming into play, incentivizing folks to be more provocative in their thumbnails in return for more views. Nothing necessarily new or unique to the gay community. The main issue is creating lines between adult content and non-adult content and recognizing that channels are inclined to purposefully gray that divide. Channels want to be in the larger, more public non-adult area, but want to include adult/"clickbait" aspects in their videos to improve view counts.

On the propaganda bit, I don't think this effect really comes into play with political/activist videos, other than them not accruing as much attention.

On another note, there has been criticism within the gay community of its objectification of relationships.

https://broadly.vice.com/en_us/article/zmbb45/how-gay-couple...

This would fall closer to the off-putting/oversharing "Family Vloggers" who shoot daily videos of their families in hopes of turning a profit. Relationships, like individuals, are spun into brands and monetized.


Think of stuff like this: http://www.theblaze.com/news/2017/06/28/queer-kid-stuff-yout...

It got so bad YouTube had to mass demonetise all LGBT videos.


(I'm going by the article text, can't see the video now) I'm curious why you think it's propaganda, or why you think it's bad? I don't think it's aimed to change anyone's mind (what propaganda is) - more for some kids that already think they're different in some way, to show that other people feel that way too and are ok with it. We could discuss what's the age appropriate level for various topics, and things like that. But if it's not relevant, you don't have to watch it. Nobody's going to convince a kid to be gay any more than they can make them straight (and unfortunately, many have tried that with terrible results), so I wouldn't call it propaganda.

If anyone is forced to watch it, that would be terrible but because of the forced part rather than the content. Otherwise, I'm happy this exists if it makes anyone more comfortable with what they feel.


Right, that's why they were demonetized. Because of 'gay propaganda' on a mass scale.


Hello, if you are being ironic, can you please tell me why it was? I saw that article, and I've seen lots of really bad examples of sexual propaganda directed at kids: screenshots that circulated around the conservative sphere of Twitter. I presumed this was the reason Google eventually decided to demonetise gay propaganda on YouTube.

Look at this I just found. Unfortunately I can't find the tweets I am talking about. But this is similar. https://aceloewgold.com/2017/07/25/popular-youtube-channels-...


You're sharing a site that peddles garbage like Pizzagate. Give me a reason why I should ever believe anything you say.


You're free to look for those videos using the search feature of YouTube. I did and they are real. Some of them feature thumbnails or content that are clearly disturbing or include sexual themes.

You would also believe the BBC which is linked from the article. http://www.bbc.com/news/blogs-trending-39381889


I'm going to assume you're not familiar with 4chan-like content, rather than trolling, so I'll try to summarise. Some people enjoy disturbing images/videos and are messed up enough to try injecting it in other contexts just for fun. When the previous article mentions something like "A discussion on 4Chan led users to discover a code hidden in the comments sections of some of the videos that led them to a Twitter account." what it likely means is: someone uploaded videos with extra information and then started themselves a spoof discussion that bled to other social networks via controversy seekers.

4chan-like communities are the source of things like pizzagate, raiding other networks to inject disallowed content, or doing things like posting videos of flashing colours on forums for people with epilepsy. Just getting someone reporting on it fuels creation of more videos like this.

It's not that the videos don't exist - it is a problem that kids can run into them by accident via genuine search terms. It's the coded message part, mentioning pizzagate and podesta, mkultra, etc. that's the bullshit part in the aceloewgold article. The simple explanation is that: some weirdos like to create this content and some idiots crave the controversy of making others watch it by accident.


It's hard to argue Queer Kid Stuff is inappropriate for kids unless you think LGBT people are inappropriate for kids. For example, lesbians are defined as "girls who love girls," which is no different from how straight relationships are discussed with children.

Ultimately, the basics of LGBT people's lives and identities are not actually hard to discuss with children and are a fact of life, particularly for kids who are LGBT themselves.


Um.... gay propaganda ?


In the future, the only skill of value will be video production. This is a common theme, for example in Vernor Vinge's Rainbows End. Getting and holding other humans' attention is always worth something.

So possibly all these weird videos that kids watch constitute necessary future survival skills.



