Better protecting kids’ privacy on YouTube (googleblog.com)
277 points by geddy 16 days ago | 357 comments



This doesn't solve the number one problem of kids content on YouTube -- they are leaving it up to the creator to decide if it's kids content. It's still not human curated.

The nice thing about Netflix/Hulu/Amazon/PBS/etc. is that a human decides if the content is for kids.

YouTube is still trying to get away with "letting the machines do it", which works for most aspects of their business, but not this.

I used to let my kid watch YTKids while I supervised. But no more. Even with my supervision it just jumps to some insanely inappropriate video, too quick for me to shut it off in time, and it's already done damage.

When a scary clown jumps out and screams, that's traumatizing, even just for a moment.

I have no trouble letting them watch all those other platforms. Heck, I let them watch PBS unsupervised.

But YouTube Kids will never play in my house again until I know that every piece of content on there has been reviewed by a human before it goes live.


>YouTube is still trying to get away with "letting the machines do it", which works for most aspects of their business, but not this.

Citation needed. I mean, they do use AI for lots of stuff, but let's not describe it as working. It has routinely caused massive aggravation for their biggest content creators, and it has been one of a group of changes that has basically made YouTube an unsafe monetization platform and driven the creation of places like Patreon.

Sure, this is the only AI fuck-up that is going to get them sued in federal court and facing charges of exploiting children, but it's not like the other AI stuff is working well. Basically, their AI is doing a bad job everywhere; it's just that this area is where they face legal liability, so they can't pull out their usual "but the machines told us to!" defence.


Heh, I was giving them the benefit of the doubt. They've been running on "algorithms" since 2005, and are still around, so I assume it works for them for some definition of working.

But you have a point. YouTube and Google get a lot of flak for things that are emergent properties of their lack of humans.


This is survivorship bias.

YouTube doesn't release many of the details of the methodology, or even the content that they block, because it can then be circumvented by the people who create and/or circulate it. It can create a Streisand effect, which only makes that type of content worse by drawing attention to it.

Some of what you can see is in the child sexual abuse material (CSAM) APIs that Google is now sharing to help other organizations detect CSAM en masse in their content. These services were originally built to do this for Gmail, Drive, and YouTube.


AI is really bad at making judgment calls on things like fair use, advertiser friendliness, controversial content (eg. LGBTQ content), and now whether or not a video is directed to kids. See https://www.cs.princeton.edu/~arvindn/talks/MIT-STS-AI-snake...


To be fair, they're the largest non-pornographic video platform, making them an endless target for any and all abuse possible to grow channels quickly and earn ad revenue.

It's inevitable that in such an environment the system they come up with ends up making less and less sense over time. It's death by a thousand cuts, and I don't think there's a good way to really avoid it.


That's a really schizophrenic argument.

On the one hand, they argue that because they are this large, they cannot curate content. On the other hand, they argue that because they are this large, they are the only ones who can curate this content.

I would feel more sympathetic to your argument if the companies wouldn't argue both cases at the same time.


> I would feel more sympathetic to your argument if the companies wouldn't argue both cases at the same time.

I don't really care about sympathy, so much as I care about reality. The companies will do as all companies do. Try to make money by garnering confidence in their product & abilities. Even if they have to lie through their teeth.

Reality is, there isn't a good system (currently) to really curate massive amounts of user submitted video content, in an environment where anyone anywhere can submit said video content.

The problem's scale is always going to lead towards either a compromise of being an open video platform, or a compromise in relation to curation.


Did the inappropriate content happen with a certain age range set? I totally believe this happens / feel like a bad parent for not cutting it off, but we have the app set on 2 year old and under and I’ve seen a lot of dumb stuff but nothing inappropriate yet. I’m just curious if this becomes an issue that boils down to where you set the age range or if I’m just playing an epically long game of Russian roulette right now.


A friend's 1-year-old was watching YT unsupervised. Some of the videos were super creepy. Like cockroaches coming out of LEGO bricks and being smashed (cockroach blood and things), and then it reversed and the cockroaches multiplied and took over the LEGO tower. The kid was super mesmerized. I was totally WTF-ed.

100% agree. YT kids is not safe for kids. I wouldn’t show it to my kids. I instead download the videos I want and manually make a playlist. There is wayyyyyyy too much garbage in YT kids.


My problem with YT Kids is that the business model of YouTube is inherently incompatible with children and law. Here in Sweden advertising targeted at children is banned[1]. YouTube Kids is more or less all product placement, influencer "reviews" and sponsored content. Why would a child need thousands of videos? I would curate and download a library of videos.

[1] https://theculturetrip.com/europe/sweden/articles/sweden-ban...


Is it just me, or is there a bigger problem here - what the fuck is wrong with people that they let their tiny innocent offspring just watch some fucking dumb TV?

Is this the best they can do as parents? I mean, we all know that just sitting dumbly on the couch and passively consuming whatever's on TV ain't healthy mentally (or physically), and they ingrain this approach in their kids from a very young age?

Parenting ain't easy, never was and probably never will be, but damn invest some proper human time in it, don't just offload raising of your kids to some stupid website which lives off ads. This is best invested time of your life.

I have a 2-week-old newborn, and damn, he ain't getting close to TV/phones/tablets till much, much later, and even then in a limited manner.


Well, I was exactly like you with our newborn.

After she turned 1, she also got increasingly harder to feed. She would throw away the spoon with her hand and not open her mouth. Just a massive rebel in general. We were really stressed out. She was starting to fall off the growth curve. The pediatrician was concerned too.

Then, while she was eating, we once showed her an ABC Kids "wheels on the bus go round and round" rhyme video; not sure what clicked in her brain. It was a night/day difference. No rebellion, eats well.

So we kinda use it as the last resort and feel a bit guilty.


Before you're a parent you kinda have these preconceived notions of how you will parent that are all based on a self-centered view. Which I say in the literal sense of the word: they are views on how you are going to parent because you think _____ (a you-you combo). Then you have a kid and reality hits — this is a young person with their own ideas. You have ways that you want to parent that are going to conflict with what they want, so you have to navigate and negotiate.

You used something they wanted as a carrot (couldn't not) to get mutually beneficial behavior; you're still gonna feel guilt, but this seems well done to me.


Sounds like the people behind the bootleg Peppa Pig videos are still going then :( https://www.theguardian.com/technology/2018/jun/17/peppa-pig...


> A friends 1 year old was watching YT unsupervised

Parenting is not something you can automate


Number 1 issue here is parents allowing a 1 year old to watch online content instead of interacting with the world and playing with them.

Full stop.


Thanks for your judgement.


You're welcome.

I'd rather judge and shame bad parents now than have to deal with their poorly behaved offspring later in society.


I'm guessing that judging and shaming a corporation's contribution to "poorly behaved offspring" is right out, though.


Or just play PBS kids, which has apps on multiple platforms, good quality, and totally free.

Right now my kids are small. I think PBS kids should be enough for them. When they want to find something else to watch, I know they just had too much screen time and they need to do something else other than watching TV.


You replied to a comment where their primary concern is, quote: "When a scary clown jumps out and screams, that's traumatizing". Not porn, sexual innuendo, or pedophiles looking for potential victims. Not war, or violence, or even brainwashing advertisements. It's clowns... their primary fear about YouTube's content is the possibility their sensitive child will be exposed to a clown.

It's impossible to satisfy everyone, regardless of the content rules, and no matter how curated. YouTube could spend a billion dollars curating content, along with surveys where 90% of parents have to approve a video before it is included, and the remaining 10% would still be the vocal minority ruining it for everyone.

Sure, YouTube probably doesn't do quite enough. But when you have parents looking to let YouTube babysit their 2-8 year old children without supervision, whose fault is that? YouTube's, I'm sure. /s


A scary clown. Like Pennywise. Randomly popping up and loudly screaming in the middle of an educational video about buses. I don't want my kids being exposed to that either, and I don't know any parents that would. The commenter wasn't talking about letting their children watch unsupervised, even supervising your children won't stop this type of thing from doing damage.


[flagged]


I'm intrigued that you think that a video that causes persistent nightmares, being afraid to go to bed and trouble sleeping across several nights can't be described as 'damaging'


For something to be damaging there would have to be some damage somewhere. You wouldn't describe a bad head cold as damaging, and that will absolutely lead to a pissed-off two-year-old with trouble sleeping for several nights.

When I taught kindergarten one of the children in the youngest class at the time was terrified of the Eensy Weensy Spider video and of the Elephant in a Hickory Dickory Dock video. I told her to close her eyes when it was coming up. She did. Eventually she got over it.

The idea that a scary clown can be "damaging" is a testament to helicopter parenting. Up to 1900 it would be a rare person indeed that didn't have siblings die before they were ten, and basically no one would make it that far without having a friend or family member die. That's trauma. That's damaging.


> For something to be damaging there would have to be some damage somewhere.

You seem to think emotional trauma isn’t real. It is. And studies of its effects show that it is just as harmful to the brain as physical trauma.


I'd describe it as a learning experience. If you don't get opportunities to grow up, you stay a child forever. Having said that, I wouldn't allow anyone younger than six to watch anything on a screen anywhere.


I had a friend who lost a finger holding a firework. Learning experiences can be damaging.


Dunno man. Kids have nightmares about all sorts of things. They cry all night and want to be cuddled.

If nightmares and screaming are repeated over and over again, it can lead to some very traumatic experiences in their young brains.


Kids want to be cuddled no matter what. That’s how humans are wired.


Look up the cartoon characters getting run over by cars, or the very disturbing rabbit hole of injection videos. Kids at a certain age take what they see and hear as core truth, and when there's violence at that level it leaves a lasting mark.


Have you ever read any of the original Grimm’s Fairy Tales? They are gruesome. Cinderella’s step sisters cut off parts of their feet to fit into the slipper kind of gruesome. Children have been dealing with this kind of stuff for a long, long time. Life leaves a lasting mark. If they’re terrified and nothing bad happens they learn they can deal with this. I wouldn’t do that deliberately to a child but lasting mark does not mean lasting harm.


Simply because trauma has kept past generations “safe” by teaching them to avoid any potential harm, does not mean it’s the best option.

Previous generations also used to "instil" a sense of fear into children by beating them. It does make them listen, as they are then ruled by fear-inducing trauma, and yet I think we're all clear on how it's not the best option for their growth.


They were also not meant for kids.[1]

1. https://www.nationalgeographic.com.au/history/brothers-grimm...


You think a lifelong clown phobia doesn't count as damage?


The clown was just an example, and it's not just "a clown", it was something designed specifically to frighten young children, where the poster had gamed YouTube's systems to have it show up in the kids section.

And I never let them watch unsupervised. But at least with every other streaming service, I don't need to preview the content ahead of time.


Wow. Who /are/ these people? What is their goal? Are there really people on earth with the exclusive motive to scare a two year old and give her nightmares? That is some next level evil.


The source of these videos could be anything from lone assholes to a state sponsored program to destabilize western civilization. Without tracking down the person who uploaded a given video and hitting them with a sock full of quarters until they explain themselves, it is impossible to know.


YouTube Kids allows you to select which specific channels your kids can watch, including selecting from a list of "trusted partner" channels such as PBS Kids. When you first install YouTube Kids it leads you through these options.


That's just putting the onus of human curation on me. Too much work.

I'll just use one of the many other services that does it for me.


I think the onus for curating a child's video content rightfully belongs on the one raising the child. How can YouTube (or any content creator) come up with a policy appropriate for everyone? Parents and guardians have different ideas of what their children should and should not watch.


Of course as the parent I am the final arbiter. But I'd like them to take the first swipe. Every other streaming platform for kids does this. Why can't YouTube?


I will channel Urs Hölzle, giving an answer he once gave me to a similar question:

"it's too hard"

which translated of course means, "the cost benefit analysis doesn't come down in favor of spending the effort".


Actually it sometimes really means "it is too hard".


That’s why everybody else does it?


The complaint is they don't curate kids' content. Saying there are very basic tools allowing you to do so is hardly solving the problem. It's vastly simpler to simply block YouTube for young children, which also completely solves the parents' problem.


Should I also personally send samples of all foodstuffs to a commercial lab before feeding them to my child?

Caveat emptor, informed consumer, personal responsibility, blah blah blah.

What the Freedom Markets™ advocates don't acknowledge is transaction costs. If consumers have to vet every single activity, the economy suffers.

If YouTube Kids is marketing themselves as kid friendly, it'd be nice if it was true. My skepticism budget is finite.


Isn't it the same thing -- whether you use the PBS Kids app or choose the PBS Kids channel on YT?


Using the PBS Kids app is so easy even my two year old can do it on their own.

Forcing YTKids to only show PBS Kids content on every platform is a huge pain in the butt.


Yes but Grrr Youtube


Not even close to the same thing, see above.


I hope that's sarcastic...

I'm just going to assume you are being sarcastic and making an ironic point about parents wanting to control what their child sees but then not putting any personal effort into it.

The harsh reality is that parents have to do work. I remember when I was a kid I hated that my parents looked through the books I checked out from the library to see if they were "appropriate" and that sometimes they would remove horror novels and other things that they thought were not age appropriate.

But the truth is at least they cared enough to check and at least they were consistent, and later on when I became a teenager they gave me more freedom. If you have opinions about what media your child has access to you have to do the work to curate...


I was not being sarcastic.

Of course I want to review the content too, which is why I usually watch new stuff with them. But I want the service to take the first go at it. Like every other streaming service for kids.

Same thing at the library. When I get a book from the kids section, I can be reasonably sure that I won't be surprised by some adult content when I sit down and we read the book together.


Just whitelist channels or specific videos.


Or just don’t use YouTube, that’s even easier.

I’m not arguing that YT is inherently bad for all purposes; just that, in this situation where you want to watch kid’s videos and be confident all the material will be age-appropriate, there are lots of apps that are simpler and better.

Rather than installing YouTube and whitelisting PBS Kids, why not just install the PBS Kids app instead?


That's called human curation. You're just putting the curation onus on me.

Too much work. I'll just watch one of the many other services that do it for me.


Use the family approved whitelist bundle your church has put out?

It still can be someone else's problem.


But why? Every other streaming service for kids does this for me. YouTube definitely doesn't have anything special enough to warrant any special effort.

Also, they still don't make it easy to lock down to just that content on every platform.

I don't have to worry about this with the PBS Kids app. I know that no matter where it runs it will be safe. My house, grandma's house, whatever.

Same with Netflix Kids. I don't have to worry about it having bad content anywhere my kid figures out how to use it.


It feels strange to me to compare a user-generated content site with a more traditional media platform. And then complain that the former doesn’t work like the latter.

I’ve never really understood how YouTube Kids was even a thing.


There are other things, but the first thing that comes to mind is that YouTube has several orders of magnitude more content than the others. Netflix is not a similar service.


Isn't that what playlists are for?


Think about the depth of human suffering you're asking to unlock with this request. Millions of YouTube videos get uploaded every day; at least hundreds or thousands of those must be flagged for YTKids. A whitelist of channels, as offered now, is much better until machine-learning-backed filtering gets a bit better.


You don't need to make a whitelist decision on everything though. Youtube Kids doesn't need to be like Facebook, it doesn't need to moderate everything that's posted. You can be highly exclusionary, you can ignore posts by smaller creators.

It's fine for Youtube Kids to be a much smaller selection of manually curated videos from specific partners, because Youtube Kids isn't replacing Youtube as a whole.

Given that their current solution basically kills ad revenue on those videos anyway, it's not like creators are going to be clamoring to be included on Youtube Kids in the first place. There's no harm in having a subset of Youtube that is highly gatekept, the same way that subsets like Youtube Red were.

Go ahead and let Youtube itself stay as a kid-unfriendly, algorithmically moderated wild west with hidden bombs of objectionable content spread around. That doesn't need to be gotten rid of, just have one corner on a separate subdomain that's nice and safe for kids.


> Go ahead and let Youtube itself stay as a kid-unfriendly, algorithmically moderated wild west with hidden bombs of objectionable content spread around.

That's exactly what the FTC fuss was about, though. It's easy to say that until a parent gives their kid access to it and suddenly YouTube is advertising and recommending stuff to a child, which the law doesn't provide a safe harbor from (not for the content uploaders, either, which is going to cause a lot of problems). The FTC guidance was to identify kids instead by the content they were accessing. Hence Google's complaint: https://youtube-creators.googleblog.com/2019/12/our-comment-... (tl;dr skip to the "Treat adults as adults" section)

edit: now I'm confused because your comment below (https://news.ycombinator.com/item?id=21975870) indicates you already know your idea here isn't going to work for the FTC?


You're completely correct. A curated Youtube Kids definitely won't work for the FTC, but it would address jedberg's complaint -- that Youtube Kids is basically worthless as long as it's algorithmically populated with shock/troll videos.

To be blunt, even the FTC's own proposal isn't going to work for the FTC (at least not to the degree they want) -- it's a bad solution with unintended consequences. I've brought up a few problems, but other people have brought up far more of them. At a certain point, beyond explaining the FTC's policy and showing the problems, it's not really worth spending the time trying to come up with a way to make it work. It's better to come up with solutions that would address the concerns of actual parents, like jedberg.

Importantly, the FTC's approach is the opposite of what parents like jedberg need. Parents need a highly curated feed that they can feel safe showing to their kids. The FTC's approach is going to lead to creators erring on the side of labeling too much of their content as kid friendly. It's also going to force Youtube to get more aggressive with its algorithms, which will inaccurately label objectionable content as kid-friendly.

Parents need fewer videos more carefully classified as kid-friendly, and the FTC is making sure that a lot more videos will be carelessly classified as kid-friendly.

Like I said below, my assumption based on the FTC comments I've read/watched is that they just haven't thought that far ahead. They're mad about Youtube in general, and they see creators as an easy way to get at Youtube. That's maybe an overly-cynical take on it, but I'm not super-inclined to be charitable to them on this one.


This is a great idea; I hope some Youtube Kids PMs are paying attention.


You're using a lot of words to say that Youtube doesn't have a viable business model.

They don't. The cost of providing the service they pretend to provide would greatly exceed the income on that service. Even with the most aggressive advertising one could imagine.


What? That's a wild strawman, but I think it's instructive -- I think this is actually a blind spot for a lot of techies. Point blank: I don't care if their business model is viable or not; I care about the service they provide to society. The service they provide (video hosting for free for anyone) is incredibly valuable, almost incalculably so for the amount of creation that it has inspired. The fact that kids can see it is just a side effect of the core value, and so they should provide tools for users to help them protect their children from content they don't want them to see. But does that mean that they need to hire an ARMY of humans to do that? No; in fact, even if they could afford it, that would be terrible!


They might profit if automated modification of videos to insert covert advertising was permitted.


> This doesn't solve the number one problem of kids content on YouTube -- they are leaving it up to the creator to decide if it's kids content. It's still not human curated.

You can imagine how Google might automate some nightmares otherwise. Say they look at the age of the people viewing a video: if enough people below a certain age view it, they automatically class it as content for children.

Of course, ideally something in the middle would be a start: they could see what children are watching, check it out, and decide whether it needs an 18+ filter or not.

Presumably they must already be doing some checks, like age on logged-in accounts and making sure adverts are not something unsuitable for children.

>But YouTube Kids will never play in my house again until I know that every piece of content on there has been reviewed by a human before it goes live.

Yes, as it stands the systems Google has in place are not secure, and because of that, unsuitable content can find its way to your children even when you have done all the right things to prevent it. Equally, there is the accountability. If PBS drops a boob at the wrong time when children are watching, it will get dealt with and a process put in place. If YouTube Kids does that, nobody else may ever know, since nobody is checking an individual's stream the way they check a network broadcast. Let alone the same level of accountability and process.


You can't compare Netflix/Amazon with Youtube. The amount of content YT must deal with is multiple magnitudes larger. There is no way humans can handle that much content, that quickly and accurately. They really don't have a choice at that volume. Either machines learn to do it properly or there won't be any filtering at all as humans wouldn't be able to keep up.


>You can't compare Netflix/Amazon with Youtube.

Yes we absolutely can. YouTube's business model of depending on user-generated content is their choice. Not being able to review all the content is entirely a problem for Google to solve. Maybe they should stop optimizing for growth and put up some hurdles to who can upload content? That's for them to figure out.


Or just have YouTube Kids be a curated subset of regular YouTube content. How hard can that be? It doesn't need to be a dynamic amount of content.


This seems especially true for YouTube kids to me. Do kids really care whether the content was produced a year or a week ago? It's not a news outlet. We literally have enough content available for each kid's whole childhood. YouTube could decide to review and allow one new channel per month per language and the consumers wouldn't be affected.


> Do kids really care whether the content was produced a year or a week ago?

I'd be surprised if there weren't a bunch of kids who really wanted to see content from the latest Minecraft update.


A full-time staff of 20 would be enough to screen more content than any child could ever watch during their entire adolescence. (Back of the envelope: a child watching three hours a day for ten years sees about 11,000 hours; twenty reviewers at 160 review-hours per day get through that in roughly 70 working days.)


Yes, they care.


I guess I should've been clearer about the age/context here. There's a period when they'll watch some show on repeat a hundred times. Or a generic cartoon - it doesn't matter when it was produced.

Sure, it will start shifting once they want to watch something their peers watch. But at the time parents choose the media, how would they even know it's not the latest production, or some mix of old/new?

Peppa Pig started in 2004. At the age kids are interested in it, if you show them ep 2, will they say "this is not from series 6 from 2019"?


They care when growing out of "kids" program and they care for specific series.

Even with a more restrictive approach Google/YouTube could whitelist specific content providers, which could provide fresh content on a frequent schedule.


I never understood this either. Why not go to a whitelisting model for channel owners and just start with large brands like PBS that have a vested interest in their reputation? You get the benefit of new videos from those channels and then slowly work through a backlog.


Now that ads are blocked, what is the revenue model for YTKids videos?

I was about to suggest that content creators pay a nominal fee for the content to be reviewed but if creators have no way to monetize the platform, that proposal is bijective with suggesting that Google just shut down YTKids. Or are parents paying already for access to YTKids and the reviews would come out of that?


The YT Kids app itself was never monetized to begin with. They only monetized viewing of kids videos on regular YouTube.


There was a post here a few months ago showing curated youtube content for kids. I'm sure a few of these exist.


The outrage machine would find something new to be upset about. Every week there'd be another channel coming out of the woodwork upset that they could not make the YouTube Kids cut.


Curation is messy, but that doesn't mean we should throw up our hands or, worse, think an algorithm can do it.

The argument here is whether imperfect curation is better than none, and the history of television tells us that it is. We never had a TV channel that randomly showed kids pro-suicide messages.


> Not being able to review all the content is entirely a problem for Google to solve. Maybe they should stop optimizing for growth and put up some hurdles to who can upload content?

This sentiment is so shocking that I had to say something.

Please no. Let's not encourage a world where people are even less free to share their ideas.

I get where you're coming from, but "Think of the children!" has been used to justify so many evils throughout history. It's an emotional appeal that should be resisted.

Putting up barriers would also be the first step towards losing dominance. But in the short term, it would suck not being able to share a video without a long wait to get it approved.


Putting up barriers on one privately owned website isn’t the same as making people less free to share their ideas. Particularly when the barrier might be something so benign as “to be included as kids content this video must be pre-approved”.

But won’t someone think of the poor content creators who might have to wait or upload their content somewhere else!


It's people's livelihoods. There is nowhere else.

"Think of the children" is equally bland.


So the issue isn't sharing ideas it's monetizing sharing those ideas?


This doesn't make any sense. There already exist non-user-generated platforms, as mentioned in the previous comments. If you don't like the downsides of the user-generated ones, why on earth would a reasonable reaction be "this shouldn't exist" instead of "this isn't right for me so I won't use it"? What a small, petty way to see the world.


If you don't like YouTube for your kids, ban it and give them access to Netflix and the like.

Somehow YouTube became the parent that oversees what content your kids get. This, to me, is a dereliction of your parenting responsibilities.


Expecting a service that has "Kids" in the name to have decent kids content is not a dereliction of parenting. It is full of low effort, addicting (to kids) videos.


You wouldn't need to moderate all the content. Just the content that has "safe for kids" stamped on it. If a creator says they are publishing a video "safe for kids" then it gets flagged for review before going live to YT kids. It could go live to general YT prior to that, which could preempt the review process if it was flagged before a moderator even got to it.

I think the issue here is that YT is trying to claim they are providing a YT Kids offering, comparable and competing directly with Netflix, Amazon, PBS, etc. If they want to make this claim and have parents allow their children to view it, then they need to back it up.

edit: I am sure there are more checks in place and that the above idea has been considered. My main point is that they need to take responsibility for whatever claims they are making. If they can't back up the claims, then that is fine - just stop making them...


> They really don't have a choice at that volume.

Crazy idea - maybe no one should be operating at that volume until they can figure it out?


No. Just because some kids are harmed, they should not operate at all? That's crazy.


Maybe you can't, as a corporation worth hundreds of billions of dollars, put it out there as being "for kids" if you don't want to pay to hire a legion of people to deal with what you can't scale. They don't get to reap the benefits and redistribute the costs.


Corporations (billions in revenue or otherwise) are not good substitutes for parenting.

If you don't want your kids on a product, then supervise them. Don't ban the product for everyone who can use it responsibly.


Use it responsibly? Are you talking about people uploading creepy videos targeted to kids, or like alcohol or something?

I'm certainly not one to say everything should be as safe as possible for kids, but there's a limit to what a parent can reasonably be expected to understand. If I set my kid up watching a Blue's Clues clip, and look away for 30 seconds, and he's seeing a clown with its arm torn off which autoplayed, I feel like I'm kinda entitled to go "hey wtf Google, your shit is broken and it's harmed my kid."

And then, of course, not let my kid use youtube anymore.


It sounds like you are ignorant of the problem here.

YouTube throws up videos to children that look fine until you get to the few seconds some psychopath has spliced into them. It's literally not possible to safely put your kid in front of YouTube unless you have pre-watched every video YouTube might potentially auto-serve up to your 3yo.


If they were building kids' playgrounds and kids were randomly getting hurt... yes. They should not be operating in that arena.



(Are you listening, Facebook, Twitter, et al.?)


[flagged]


Not 'the internet'. The current version of the app that's marketed as for kids.


We can't allow any children to see any inappropriate content! Let's keep banning things until we make that true!


Nobody's banning it. They can go to normal youtube if their parents let them. You're tilting at windmills.


Oh I'm sure the law will be written in such a way that it'll be super easy to comply. Barely an inconvenience!

It won't stifle communication on the internet at all. Trust us.


Do you think it stifles anything to categorize movies as either G, PG, or neither? It's just a guideline that parents can choose to use or ignore.


You are talking about banning 'G', here.

The problem with excessive regulation is that complying with it can be extremely complicated and time consuming. If you don't have the money to pay for that, you either risk being fined or sued, you don't play.

It sets a bar that only already wealthy people can get over.


What law? I think most people are suggesting Google do this without the requirement of regulation to force them.


Sometimes I like to write my grievances on a slip of paper, and put it in a bottle. Then I throw the bottle into the ocean.

I hope someday someone will read it.


Here's another crazy idea: if you're just going to hand your kids over to YouTube, then maybe your parenting priorities are misplaced.


I'm gonna strawman here real quick with a meta observation:

HN: parents need to stop being helicopter parents and kids need to have independence to grow in order to be functional adults, otherwise you're a shitty parent

Also HN: parents need to know everything their child does and know what they are watching at all times, otherwise you're a shitty parent


The big tech companies fundamentally do not give a crap. Traumatizing young children is a roundoff error in their calculus.


YouTube doesn't have a god-given right to the toddler market. They are the ones implying (through the existence of YT Kids and such) that they have done the filtering!

If they were honest about their lack of actually effective triage, this whole controversy would have a very different valence.

Stop pretending you're triaging well if you're not!


Sure you can. Anything uploaded to YTKids goes through one of two processes.

1. it's from a trusted provider, send it on through. Add a complaint mechanism that makes it fairly easy for a trusted provider to be removed from this list (DMCA for content, if you will).

2. it sits in a queue until a human looks at it. Prioritize videos from those who were recently removed from the trusted provider list as a way to dampen the harm from abusive reports.

Suddenly, it's a much smaller problem.


Do it by channel. Either a channel is approved for children's content by a human or it isn't. Then just re-review periodically. There's much less churn in channels compared to the content.


That wouldn't be good enough. The freaks who like to game the system with scary shit would just game that system too. They wouldn't care that they burned their channel.

The only way this works is to have a human review every piece of content before it goes live, like every other streaming service with a kids section does.


Why not just whitelist channels and disallow ones with less than say 500,000 subscribers. Surely nobody with that large of a following who produces kid targeted content would burn their own channel down just for the "lolz".


The fear is that an un-kid-friendly video could get through, so it would need to happen per video.


The "there's too much to moderate" argument for internet companies sounds like the "too big to fail" argument for banks.


They could do it, but they don't want to, because they don't have to.


> But YouTube Kids will never play in my house again until I know that every piece of content on there has been reviewed by a human before it goes live.

I don’t have kids, but this seems a bit over the top. Children live in the world with the rest of us, and are exposed to things every day that haven’t been reviewed and approved as child safe. Angry drivers on the road may flip you off or yell at you while your kid is in the car, or someone at a stop light may be blaring music that you find to be offensive and not approved for consumption by children. Both of these things would objectively be scarier than a clown on a computer screen, yet kids see and hear these things every day and somehow most of them don’t grow up to be serial killers. Sure, try your best to shield them from harmful content, but completely shutting down something that is 98% clean, educational, child friendly content seems like throwing the baby out with the bath water.


You clearly have never seen the type of content psychopaths splice into child-targeted YouTube videos, or you would not be making this argument.

We are not talking about some mildly offensive or violent behaviour. We are talking about things that would be illegal to show on prime time TV (child and adult timeslots)

10 great child videos does not make up for 1 psychopathic murder scene spliced into video 11


OP made no mention of “psychopathic murder scenes”. He referred to clowns coming out unexpectedly, which seems in poor taste on the part of the video creator but is a far cry from a “psychopathic murder scene”. I am quite certain that if such a scene were spliced into any video I watched, even as an adult, I would never watch YouTube again, and I would make some mention of it on social media. We live in a culture where even the slightest faux pas sparks widespread outrage across social media, yet I have never heard of this being a significant issue. So it seems like this is not a common occurrence.

Have you ever personally witnessed a “psychopathic murder scene” being spliced into a YouTube video? Do 10% of videos really have this in them? I have never seen it once, in well over a decade of relatively heavy YouTube use. Most of my friends are in the tech scene and have children, and none of them have ever mentioned this as an issue either.


Maybe it's time for piracy and people power to take over. Parents curate for other parents, perhaps parents that know each other. Use youtube-dl and other things. Have an open source app (you might have to get away from iOS for this) on a tablet that can play this community-curated content.

wrt piracy: keep in mind that parents are often relatively financially disadvantaged, as well as being the most financially targeted group. Some parents turn to YouTube rather than (say) Disney+ for a reason. Those that can afford it do so because it takes less mind time and real time than having to do even community curation.


How old are your kids? I watch my 8 and 11 year olds on YT Kids fairly routinely, and while lots of the content is kinda junky and of questionable quality (so many toy unboxings and eat-a-disgusting-thing videos...) I haven't personally seen anything I'd call "inappropriate".

My complaint is the opposite: lots of good, sensible content (most recently I was trying to set up my youngest to watch drum cover videos) just isn't there because no one thought to put it there.


I'm not the person you're replying to, but my daughter was able to use my wife's iPad when she turned 1. I decided we should 'baby proof' it, remove YouTube (along with a few other apps), and add YouTube Kids. In many ways YouTube Kids was worse than YouTube, since YouTube would have shown our suggestions. YouTube Kids is full of addicting, low-effort content. Think toy unboxing, surprise eggs, videos with colors. Nothing thought provoking or educational. They're mindbogglingly boring, but they turn kids into zombies. And they get outraged if you take it away.

You have to whitelist a few channels, then it works ok. Though my daughter is only 3 now, and would be happy watching Blippi all day. Seems like it would be more complicated to whitelist channels as kids get older.


> Nothing thought provoking or educational.

OK, that's just not true. There's a ton of great stuff on YT Kids (though as I mentioned not enough). But yes, you have to find it for them and show it to them yourself. Left to their own devices most kids aren't going to make good choices. They're kids, that's the point.

This is just the "be a parent" thing, really. There's absolutely no way to get around the need to educate your own children. Certainly Google isn't going to do it for you. And Google certainly didn't invent junky content; I remember staring at Hanna-Barbera garbage constantly as a child.


I didn't mean to say that there was nothing thought provoking or educational, just that YouTube wasn't showing it. It shows high view count, mindless, addictive videos. Even if you start with a good one.

The funny thing is, I had to choose my child's age range (or enter the birth year or something) when setting up the app. Then, when I went to whitelist channel by channel, it had a bunch of recommended channels full of good/decent content. That should have been the default whitelist out of the box.

It isn't just about "being a parent". My daughter is very smart for her age. My wife used to be a teacher, but since we had kids, she stays at home with them. She has taught my kids a lot. She has learned a good amount from content on YouTube, Netflix, and Disney. I think it's good that she can use an iPad better than my parents (even though she can't read). She also can solve puzzle games on her own. Of course we guide her and limit her usage. My point is that an app marketed to kids should not be full of garbage content.


Well, according to the FTC and Google, adding the video to YouTube Kids means never getting your video recommended or added to playlists, no comments or subscriptions, and no ad revenue. Who would want that?


Dare I ask why you put your guard up against YouTube instead of either deciding that your kids are too young to use a tablet/smartphone/PC in general, or that sometimes content is scary but it's OK to talk about?

PS: I'm not trying to change your mind; I'm trying to understand the reasoning behind it, coming from a more liberal upbringing and, as a result, a more relaxed attitude about what is banned and what is not.


You clearly haven't seen some of the horrifying stuff that slips past the filters. We're not talking about the episode of Leave it to Beaver where Wally smokes a joint. There is a category of malicious amateur cartoons that look kid friendly in the thumbnail and in the first few minutes, and then become sick. The Elsagate stuff was part of that. I have seen, on YouTube Kids: an ice cream truck driven by a cartoon animal, who poisons and kills the other cartoon animals; and a knock off Mickey Mouse cartoon where Mickey rips his own head off unexpectedly with full blood and gore. This is not "have a discussion with your kids" territory, it's nightmares and trauma. As a parent, the risk of that happening obliterates the value proposition of YouTube for kids.


> We're not talking about the episode of Leave it to Beaver where Wally smokes a joint.

As a big Leave it to Beaver fan I find this hilarious. And would pay anything to watch this episode if it existed.


Well, that does not sound more horrifying than say Itchy & Scratchy from The Simpsons.


That is not a cartoon for young children though. It is rated TV-PG and even TV-14 on some episodes.


> This is not "have a discussion with your kids" territory, it's nightmares and trauma.

The big tech companies are so far removed from real users that they'll just tighten a few screws on their machine learning and call it fixed. You're a roundoff error to them, unfortunately. These people are the real machines, servicing the electronic machines to make money for their masters, the execs and stockholders.


I have seen it, yes. The sudden gore is obviously unfortunate for a toddler to stumble across, even for an older child. I understand that you would like to shield your children from such content, but I also strongly believe that it's possible to do so with communication, rules and strict guidelines on how to look up content and suggested videos.

"As a parent" is one of my favourite phrases, by the way. It translates to "I know better than you, regardless of your parental status". Which is fine. I was only wondering about the reasoning, after all. :)


> "As a parent" is one of my favourite phrases, by the way. It translates to "I know better than you, regardless of your parental status".

I sympathize with this. Lots of parents are vociferous morons, because lots of people are morons and becoming parents does not fix that. But: There is a real difference, at least for many people. Your brain rewires itself to be hyper-vigilant about child safety and child harm. I can't watch TV or movies anymore that involve harm or the threat of grave harm to children, but these wouldn't have bothered me before. When I enter a room, I subconsciously scan it like Jason Bourne if he were a toddler's bodyguard. Slippery surfaces? Sharp table corners? Unfenced stairs? I've compared notes with friends and this is a common, if not universal, change in parents. They spend a lot of time thinking about child safety.


I think the OP is saying the same thing you are, which is that smoking a joint is not the problem: it’s the random gore.


I sure did miss that "not".


You seem to have misread the parent comment.


I think it's pretty fair to expect that a product or platform like YouTube delivers a consistent and predictable experience instead of blaming users.

Yes, if you are worried, the smart thing to do is not to expose your kids to YouTube at all. Wishing that YouTube could be better is independent of whether you choose to expose your kids to it.

Also, it has nothing to do with being liberal or conservative, at least with me. I will let my kids watch a lot of stuff, if we intentionally pick it out. If you saw random baby shark or finger family between whatever videos you wanted, you'd be right to complain and no one would blame you.

There is zero product design behind the way kids are exposed to scary things. No product manager decided it should be that way. It just is that way because it's not prioritized by the business or it's too expensive.

Imo if YouTube markets to kids they should be accountable. They should clean it up.


> Dare I ask why you put your guards up against YouTube instead of either deciding that your kids are too young to use a tablet/smartphone/PC in general, or that sometimes contents is scary but it's OK to talk about?

My kids are too young to have unrestricted access to the internet. That's why they are limited to specific kids apps.

But YTKids is different than all the other kids apps. All the other apps have a human check all the content before it goes live.

So I can be reasonably certain that the content will be ok without watching it all first.

With YT, I can't have that certainty. So they just don't get to use that platform anymore.

Yeah, bad things happen, and when they do I talk to my kids about it.

But I don't want to purposely subject them to such things if I don't have to.


Dude, the stuff we're discussing freaks me out just looking at it, and I'm a grown adult. If I'm watching a horror/gore movie I'm fully prepared for everything that's coming; if this kind of thing randomly appeared while watching a video about how to make spaghetti and meatballs, with no acknowledgement (as tends to happen in these kids' videos), it'd fuck with me big time.


Not the OP, but we tried that with my daughter. It turns out that’s only a viable strategy if you’re willing to absolutely never use a phone or tablet or computer or tv in their presence, ever.


Why is that? My daughter has extremely restricted access to screens. I code for a living. She understands that is my work. She understands we consider her too young to decide what content she watches. And even when she doesn't understand, she understands it's not her call.


You have a very understanding kid then, I'm jealous.

We let ours decide what to watch / play out of a handful of things we've hand-picked as OK (such as Daniel Tiger's Neighborhood or Super Simple Songs), and iOS's guided access mode has been a godsend for this.

Super Simple Songs is on YouTube and has great content for kids, but I refuse to use YouTube Kids as we tried it once and couldn't control it well enough—weird videos started creeping in almost immediately. So, we use regular YouTube and disable autoplay (which we have to manually check every single time, because YouTube has randomly re-enabled it for us a couple of times, which led to the weird videos again).

We do limit her screen time (and consequently, ours), but trying to keep her away from them 100% of the time? That's honestly not realistic, nor fair.


This. I was also not allowed to watch certain TV shows if they happened to air at a time where I wasn't put to bed yet. And don't get me started on magazines in the supermarket. I didn't understand why not, but I knew that disrespecting the rule was not worth getting my curiosity satisfied.


You were still allowed to watch TV shows then, I presume? Just like pretty much everyone I know? Our daughter is too, and we absolutely control what she can and can not watch. But in HN terms, YouTube is a massive foot-gun compared to other services. Auto-play very quickly leads you to trouble, and you can never quite trust YouTube to not re-enable it, or show thumbnails for inappropriate content at the end of a video.

At our daughter's previous day home, the provider started letting her watch a YouTube channel called babybus (against our previous agreement of no TV, which was the straw that caused us to quit her service). At first glance the show is innocent enough; it's just a bunch of cartoon characters singing and going on adventures. But if you actually pay close attention to some (not all) of the episodes, it's a weird fear-based morality show from China that threatens death if you don't behave a certain (rather authoritarian) way.

The frustrating part about it is that there is a ton of really great kids' programming on YouTube that simply wouldn't be possible without it. It just takes an order of magnitude more effort to sort it out than pretty much any other service that provides content because you have to curate everything yourself.


Honest Question: If you did use a device and disallowed the same for your daughter, did your daughter "freak out" in some manner?

I ask out of curiosity and personal learning, not out of any sort of judgement. I do not have children, and if I did, it would likely not have been in time for this to occur.


It's not quite as simple as you've laid out, but sometimes it can be, depending on everyone's moods. Sometimes she's perfectly fine playing in her play kitchen while we watch a 20 minute show, other times she wants to watch a nursery rhyme and sing and dance along, or watch Lucas the Spider while she cuddles her stuffed version of him.

You don't realize just how much free time you really have until a toddler demands it all away from you. Meanwhile, dinner needs to get cooked, dishes need to be done, laundry has to be washed and folded, the house needs to get cleaned, etc. And your kid wants / needs attention every few minutes (at least ours does—she can only colour or paint or play with toys for so long by herself before she wants to play with you).


And it's not just the content itself. Often enough the commercials are for movies with people brandishing guns and shooting folks. What in the f'ing f…?

First thing I do on a kid's device is ban YouTube and similar apps and set the web to a whitelist. Check the screen time settings.


YouTube still has some amazing content for kids.

My solution has been to curate specific videos, then embed them in HTML pages and turn off the suggestions that would normally run after the video.

Yes, it’s more work, but I see real benefits in this approach.


same - too much curated content available to let the kids watch random YouTube videos


I don't understand that thing with the scary clown jumping out and screaming, or any other shocking content like that. What do they gain from those videos? Or are they just psychopaths?


It's entertainment meant for an older audience than small children, I suppose.


The creator deciding isn't human curated?


Not when talking about platform curation.


Fear of something that is not real, like a scary clown, could be seen as a good parenting opportunity. There are crocodiles.


We're dealing with two-year-olds here, you know, the type that doesn't even have the ability to run away from danger even if they're able to recognize it. And even if they do, they wouldn't be able to do much about it because, again, they're small children.

They're children, not small adults.


You can use the YouTube Kids app, which has been hand curated. Albeit not as good yet, at least it's safe.


YouTube Kids is what I use and it is most certainly not hand curated.

Or if it is, they need better hands, because they're terrible at it.


> they are leaving it up to the creator to decide if it's kids content. It's still not human curated.

The word for this is "parent".


Unless you are able to watch them every waking hour, this is simply not possible. At some point you have to cook, clean and generally do all the stuff that needs doing without worrying too much about your child hurting itself.


What generally happens is you set rules and teach what is acceptable or not. When you are not around the kid is expected to follow the rules.

Many kids do not follow these rules. They get punished and have valued things taken away. Some learn a lesson and will follow the rules and others do not.

The ones who still don't listen have to be with you when you cook and clean. If you go out, another parent/babysitter/older child will watch them.


You’re missing the point, I believe.

What rules can you set in place such that your kids don’t accidentally see something explicitly targeting them with gore or other traumatic content?

The only rule that seems to make that possible is “don’t use YouTube Kids”.


Unless you are able to watch them every waking hour, there's nothing preventing your child from clicking on a video which is NOT labeled child-friendly.

It is not Youtube's responsibility to raise your children.


> I used to let my kid watch YTKids while I supervised.

I think this is where you lose me -- why is allowing a child untethered access to a content portal appropriate? I.e., why is the responsibility on the content portal (YouTube) and not the parent or supervisor?

I might be a tad too much on the individual responsibility side of the spectrum here, but just like the old argument from the 90's ("The TV ain't a replacement for a babysitter...") I'm not sure the responsibility necessarily needs to be on the platform here over the parent/supervisor.

One suggestion: You can create a white-list or playlist of pre-approved content creators and/or videos, and offer access that way.


It’s easier and more foolproof to just not use it. That’s the point - their service isn’t working, and it’s better for many parents to just tune it out.

That’s parenting - avoid the shitty services that can’t put out a good product so you can focus your effort on something more meaningful. Or YouTube can fix the problem and it will be allowed in my household.


We generally trust that every page of a kids book will be roughly as expected given the title and author of the book. We trust the same of things like PBS kids programs.

The same trust cannot be extended to YouTube.


The TV can't replace a babysitter, but it's still reasonable to expect content labeled for kids to be filtered.

If YT can't and won't do it, they shouldn't label it as kids content. They have a separate product/brand for it, so it should be what it claims to be.


And the expectation is absolutely there because the YouTube Kids app is filtered but does a terrible job of it.


>why is the responsibility on the content portal (YouTube) and not the parent or supervisor?

because given how easy it is to access content on the user side, there's no other feasible way to do it. You can take the TV remote away from your kids, but that only turns the TV off. Tablets, PCs and so on are general-purpose computers with plenty of workarounds; in practice, it's extremely hard to stop kids from accessing content they can technically reach without greatly diminishing the value of the device or service.


Whatever the intent here was, it's having a completely bonkers outcome, and a real chilling effect on content creators who have done no harm.

Content creators, who are not intending to target children with their videos, are taking steps to ensure they're not ensnared by the onerous definitions the FTC has given for "children's content." The FTC has indeed explicitly said that the content creators, not YouTube, are responsible for flagging their videos as "for kids," and the content creators are personally liable to pay a $40,000 fine if the FTC thinks they made a mistake [0]. So videos where adults dress up in costumes, producing videos for other adults, but that a kid might like? Yeah, they don't wear those costumes anymore, because the FTC might consider it targeted-to-kids.

Meanwhile, actual garbage aimed specifically at children, like Elsa-gate and animated shorts with instructions for viewers to commit suicide, go untargeted by this non-fix.

[0]: https://arstechnica.com/gaming/2020/01/the-ftcs-2020-coppa-r...


This is going to be ugly for people who do (did?) family-friendly content such as a lot of gaming-related stuff. I'll use as an example Trainer Tips, which is an intentionally noncontroversial channel about Pokemon Go.

As I understand it, if he marks his videos as being for kids, they end up effectively demonetized and comments aren't allowed. If he doesn't mark them as being for kids, he's at risk of massive fines. If he marks them as adults-only (or just starts swearing or the like), they're also effectively demonetized.

Guess this explains all the push by a lot of creators towards Patreon and the like - the simple solution is to have your videos on YouTube marked as for kids, but to push for people to support you on Patreon to get access to discussions, a channel-specific Discord, etc. for the kinds of things that would previously be in comments.

Edit: Oh, and no notifications when new kid-friendly content is posted, so if you want notifications 'Support me on Patreon'


> As I understand it, if he marks his videos as being for kids, they end up effectively demonetized and comments aren't allowed.

Source?

The blog post from Google mentions no longer serving personalized ads, which sure is much less effective, but it's still monetization.


I would expect that any non-personalized ads served on videos marked as being for children will probably also be targeted at children. I suspect that significantly reduces the pool of ads and likely restricts it to much lower ad rates.


> I would expect that any non-personalized ads served on videos marked as being for children will probably also be targeted at children.

In a weird coincidence, my SO told me a few minutes ago how cruel YouTube is being right now by showing her Disney World ads. It's clearly happening because she recently watched a bunch of "Into the Unknown" clips from Frozen 2.

It made me question two things: have they already implemented these changes (to me, any content from Frozen 2 is targeting kids), and, more importantly, does COPPA apply to ads?

If it does apply to ads, any Disney ad is clearly targeting children, so any decision to show them based on tracking should point toward tracking that targets children.


they’re just signing up to get fucked all over again if they move to patreon.

we need good decentralized platforms, no?


It's not so much Patreon as some sort of outside discussion, notification and possibly discovery platform with YouTube reduced to serving media instead of discovery. That outside platform would likely need to be membership based or paid, probably both, because that "paid" bit may provide a lot of cover against children getting access - at least on the basis of T&C that say something like "as part of this membership you agree to not allow access by any persons under 13 and agree to cover fines and legal expenses we may incur as a result of you violating these terms or as a result of collection attempts."

That would be a combination of "get us in trouble and you're in it too" and "as you can see Your Honor, we've made strong attempts to enforce keeping children out of the community."


any centralized platform that gains significant traction will inevitably fuck w/ some of its users for their own ends. i think as a creator either owning the platform yourself (impractical for almost every1) or something decentralized would be ideal. that way you don't give anyone a veto over your destiny.

to replace youtube, it'd have to be both decentralized video delivery, discovery, social as well as decentralized monetization -- which i think could be patreon style or mayyyybe some sort of ad delivery.

not sure what the legal ramifications would be exactly, but i certainly hope we aren't in a place where otherwise-valid enterprises would be banned unless you could (impossibly) prove kids weren't using them.

(also there have historically been lots of people making $ off providing targeted-to-kids content (toy unboxing, slime videos, tasting candy, &c), and it's not clear to me that's a bad thing...?)


Honestly, the issue is: how do you pay for a YouTube without ad dollars, or without it being the pet project of some billionaire benefactor? YouTube itself is reportedly "roughly break-even" even with all the ads and monetization, and without providing any proper support for the people making the content. [0]

[0] I've heard of a number of creators getting within a few days or a week of permanently losing their account because of some weird combination of things, and only getting it back because a friend of a friend, not their official YT liaison, was able to find out what was wrong and how to fix it.


> Oh, and no notifications when new kid-friendly content is posted, so if you want notifications 'Support me on Patreon'

How about RSS? Each channel has an RSS feed, and iirc it has only failed once (as in, a few days with no updates even though there were new videos) in the last 4-5 years. I don't use YouTube's system for following channels anymore; I just have a YouTube 'subscriptions' category on my Feedly.
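
For reference, each channel's feed is an Atom XML document at a fixed URL, so a stdlib-only script can list the latest uploads (CHANNEL_ID below is a placeholder for the channel's real "UC..." id):

    import urllib.request
    import xml.etree.ElementTree as ET

    # Every channel exposes an Atom feed at this URL; CHANNEL_ID is a
    # placeholder for the channel's real "UC..." id.
    FEED_URL = "https://www.youtube.com/feeds/videos.xml?channel_id=CHANNEL_ID"
    ATOM = "{http://www.w3.org/2005/Atom}"

    with urllib.request.urlopen(FEED_URL) as resp:
        root = ET.fromstring(resp.read())

    # Print the title and watch link of each recent upload.
    for entry in root.iter(ATOM + "entry"):
        title = entry.find(ATOM + "title").text
        link = entry.find(ATOM + "link").attrib["href"]
        print(title, "-", link)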


Sure there are options, and technically savvy viewers can use them. On the other hand, how widely used and recognized is RSS after Google managed to embrace it, kill the competition and kill the then-dominant RSS reading platform?

"Have your viewers use RSS for notifications" is going to work as well as telling them to have all their viewers use Opera or IE.


>The FTC has indeed explicitly said that the content creators, not YouTube, are responsible for flagging their videos as "for kids," and the content creators are personally liable to pay a $40,000 fine if the FTC thinks they made a mistake

Can you clarify? So you're saying that the FTC will punish content creators who _don't_ flag content as "for children" if it winds up being deemed "targeted to children" by some subjective standard anyway?

Those videos wouldn't show up on YT Kids, correct? So they're not going to show up for kids anyway...right? That sounds like the way things should work.


My take is that the FTC is operating from a perspective that if you create content that's designed for children, and you put it on normal Youtube without labeling it, you're sort of "tempting" children to come into an unsafe environment.

In the Ars video, the FTC chairman says the following (16:30):

> "On the Internet though, you don't know who your users are. So what the COPPA rule more specifically does, is, 'where the content is directed to children, we are presuming the users are 12 and under.' [...] If however you are a general-audience platform like Youtube, you don't know what the content necessarily on your platform is. [...] That's what we mean when we talk about strict-liability for content creators."

To me, I interpret this quote (as well as the rest of the video) to say that the FTC does not care whether or not the platform is directed towards children. Keep in mind that Youtube is already officially designated as not directed towards children under 13. If your child is using Youtube, you are violating their TOS. My personal interpretation of this video is that the FTC is saying, "we don't care if the platform is for children; if the content is for children, the rules apply -- no matter where you put it."

As to whether or not content creators should be liable for Youtube collecting data that the creators don't even have access to, that's another question. But the FTC doesn't seem interested in asking that question. It's pretty clear to me that the chairman views creators purely as a means to getting at Youtube, regardless of what the consequences are to the individual uploaders. Creators are the "fish in a barrel."


The FTC is specifically targeting videos on the main YouTube site, not YT Kids, (because they claim kids might be on the main site anyway, regardless of the ToS forbidding children) and their criteria for whether they consider it children's content is a set of fairly broad guidelines outlined in the article in the post above.


This is Game Theory’s explanation of it:

https://youtu.be/pd604xskDmU


I've already had two channels my kid likes to watch stop offering kid content. They've totally changed their content, and it's nothing he enjoys. It sucks.


They're trying to automatically detect content that's aimed at kids. So creepy Elsagate stuff should end up being flagged.


I'm assuming you're referring to this [0].

Looks like the two options are either the video is removed, or put behind an age gate.

Isn't the FTC's issue here that they simply don't believe YouTube when it comes to an account's age? A kid could be watching the targeted-to-kids content and still be on their older sibling's account, or they may have lied about their birthday (who didn't?) - either way, their argument is that YouTube and content creators are violating COPPA if the content itself is what they consider targeted to kids, and a kid watches it - regardless of what YouTube claims their account age to be.

So putting the creepy videos behind an age gate seems not to be a valid defense, in light of the FTC's reasoning that age gates are unreliable. Which is why their silence on this side of the issue is confusing to me.

Unless there's some kind of tunnel vision going on here, and they're worried about the ad metrics or whatever that YouTube is generating from minors - and the FTC just doesn't care about the actual content of the videos, so long as you're very careful about what data you collect and how you monetize videos a kid might watch.

[0]: https://youtube.googleblog.com/2017/11/5-ways-were-toughenin...


Think about how little actual user value YouTube itself provides, apart from the network effect. It's nothing but rent-seeking. IMO if they're going to be landlords, they need to fix the appliances.


I'm sorry, but this is a bad take.

Of all the free services available online, YouTube is maybe the single worst example you could pick as far as "offering little actual user value".

The fact that any user, across some 76 languages, can upload any number of videos up to 12 hours long that are then transcoded, captioned, and made shareable FOR FREE is absurd.

If those videos never get more than 10 views, they're still hosted and accessible FOR FREE. If they blow up and get billions of views, the service scales to serve that FOR FREE.

If you get really popular, you can CHOOSE to show ads on your videos and get paid for views, while still taking advantage of the free hosting and infrastructure. It's a goddamn miracle that creating and publishing your own web videos has become a viable occupation. That didn't exist before YouTube.

Even more impressive is spending money and resources hosting all of the videos that will never be viewed. A user here posted a website that plays a continuous feed of random never-before-viewed videos [1]. It's astounding.

All this while also operating the second-largest search engine in the world to help viewers find your content for free, and being so reliable that it makes global news when it goes offline for some users for an hour [2].

I can understand criticizing some of YouTube's policies or interactions with mega creators, but I legitimately can't comprehend calling their contribution "little actual user value".

[1] - http://astronaut.io/

[2] - https://www.cbsnews.com/news/google-cloud-outage-hits-youtub...


Here's the problem I have with Google and the other big platforms, and why I think the "user value" argument is overwrought:

They've pushed out everyone else that could compete: they've achieved such economies of scale and vertical integration that they can afford to build many products AT A LOSS, because only at their scale can one actually make money on the product.

A $50 Google Home speaker or Amazon Alexa should be $200. They're selling it to you basically at cost so they can get your data. Similarly, storing all those YouTube videos costs them money. This deal isn't free for the consumer; it's a tradeoff where you HAVE to use them, the company store.


Well aren't the bandwidth costs for hosting a video that gets like a million views or whatever reasonably expensive?


Sure, but their revenue running ads is quite high to compensate.


Some of the features that get disabled for "children's content" are a bit surprising:

Playback in the Miniplayer

Save to playlist and Save to watch later

Likes and dislikes on YouTube Music

Donate buttons

I guess it's hard to do these in a way that is compliant with the privacy rules, or they're worried that kids will donate $1000 to their favorite YouTuber with Mom and Dad's credit card? It will be interesting to see how good their machine learning is at identifying "kids' content". From reading the FTC page, the delineation seems a bit arbitrary. I suppose we'll see if there's a financial impact on YouTubers that are borderline (is a video of someone playing a video game considered children's content? What if the video game is rated M? Will the tiebreaker be statistics about their actual viewers?)


Unintended purchases are a real issue. I've witnessed firsthand how young kids tend to press anything on the screen without any thought. I saw them press and nearly purchase YouTube's Super Chat, "a tool that lets you pay to pin comments on live streams", without knowing what it was.


Especially if it gets in the way of the content they want to view, they just click buttons to get things moving, and the next thing you know your Google account has bought an app.


Do you guys seriously put your kids on an account with credit card access? That sounds like a recipe for disaster.


All that needs to be true is that enough people make this mistake to warrant the UI change. No condescension or lecture needed.


“Enough people” does not make a questionable act less questionable


Unless you're giving them their own device, it's easy to make a mistake and leave an account with a CC accessible.


Even with a full password required for confirmation, or without a CC on file, you can still see them click the link. The behavior is the same, and the reason it takes you to a purchase page is that people do click it.


I don't understand the prevention of adding to a playlist. Now I can't make a custom playlist of children's content for my kid?


Maybe this is intended to prevent "YouTube babysitting" aka "make a long playlist, park kid in front of youtube, go and do something else"? Without the ability to schedule up hours of content, one is forced to look after the child regularly, even if it's just to start the next dull bit of entertainment.

Though if that was the motivation I'd expect them to also disable auto-play of next video for content targeted at kids.


> Without the ability to schedule up hours of content, one is forced to

watch hours of YouTube recommended up-next videos!


I guess YouTube is covering itself against someone alleging that custom playlists are a form of tracking.


They're trying to avoid issues with COPPA, which basically says that they can't store user data for anyone under 13 without explicit formal consent from a parent (with some really complicated exceptions and ways to comply).

https://en.wikipedia.org/wiki/Children%27s_Online_Privacy_Pr...


And they consider a playlist made by an adult that contains a few videos for kids to be data on a child?


They're trying to shut down all their possible COPPA liabilities, because those can get incredibly expensive very quickly, so they're treating any data on made-for-kids content as if it were data on kids. Also, skimming the FTC's FAQ, it sounds like if you're serving a web page targeted at children you have to do a lot more, so Google having channels aimed at kids puts them under the stricter standards for ensuring kids' data doesn't get tracked.


It seems as if the default assumption for all kids' content is that a kid is the one watching, and hence that a kid also made the playlist. Perhaps playlists and similar features will be re-enabled at a later date with default implicit parental consent.

I rather like the heavy-handed approach as a parent (who can be both lazy and diligent), as I think it reinforces good parental control of video content.

This will force lots of behavior changes for both consumers and producers. The key will be how they handle suggested videos, which may have unintended negative consequences, as I am sure the UX will still be based on generating maximum ad dollars, targeted or not, rather than "healthy" video viewing.


My uneducated guess: disabling playlists entirely was easier to implement than disallowing playlists that mix children's and non-children's content.



Makes me think that YouTube can't avoid feeding playlists into their big recommendation-learning algorithms, or something, so it's easier to just disable the whole feature.


You could use local browser bookmarks. No privacy issue with those!


Having a favorites list that follows you across devices is nice. Being able to publish a publicly viewable playlist is nice.

Also, on mobile that would require you to exit the app and move to a browser. Figuring out how to pull up a video from the app in a browser requires a certain level of tech fluency. (You have to know to hit "share", "copy link", paste it into the browser, then somehow force the browser to not punt you back to the YouTube app.)


I think we'll find that distinguishing between "children's content" and "family-friendly content" is difficult not just for YouTube's algorithms but for humans as well. I suspect we'll see even more channels get hit by this simply because they're not identifiably "adult" in any way.


Million-dollar question: is anime for kids?

Guess we will see.


Yeah. For example, I don't find VeggieTales to be family friendly. It's a minority opinion, but I can't get it removed from my kids' profiles. So now I have to talk to my children, ugh.


I mean, most of the shows for kids look extremely creepy and unsettling to me.

The introduction of a world where everything is 'fair' and 'nice', controlled by 'authorities', is something I notice often in those shows. The faces are big and stretched away from reality as a distraction from the theme. Some of them encourage weird experimentation that can easily go wrong while pushing odd stereotypes. The good guy beats the bad guy with violence: punitive justice. A fair bit seems sexual, or promotes eating disorders, creepy parental expectations and misleading nutritional science.


I watched VeggieTales in school as a kid, but I don't remember much about it other than vague recollections of a Joseph-and-his-many-colored-coat adaptation.

I'm just curious, what do you find not to be family friendly about the show?


Not the OP, but if you're not Christian, that show might not be friendly to your family, as it is explicitly targeted at Christians or at people the show's creators hope will become Christians.

So that may not go over so well with, for example, a Jewish or agnostic family.


ah okay, makes sense. I always thought the biblical stuff was just a fraction of the stories they adapted.


Propagating Christianity was an explicit goal of the show:

"Their aim was to produce children's videos which conveyed Christian moral themes and taught Biblical values and lessons."

https://en.m.wikipedia.org/wiki/VeggieTales


I imagine they mean that they don’t consider a show that teaches a particular religious tradition as fact to be appropriate for children outside of that religious tradition.


Are those perhaps all features that require either 1) being logged in, or 2) using a more complex version of the site that requires cookies?


CYA. It's either that or denying kids access to YouTube altogether, and that is very hard to implement, so better to disable the features that might lead to legal exposure.


Google seems to opt for the nuke it from space option as a 1st response. Let god or whatever sort it out.


YouTube is politically popular to attack, so they know they aren’t going to get the benefit of the doubt. And when regulators come after Google they seem to like aiming their fines in the billions of dollars range. So it seems prudent to respond to any inkling of having given a regulator an excuse to investigate by burning the entire thing down and vowing never to go near the wreckage ever again.


The suspicious thing is YouTube tries to detect children's content, not users who are children… so this means YouTube is doing ad targeting on what it believes to be children so long as they're not watching “children's content”? Or such is the impression I got from this excellent essay about it: https://www.youtube.com/watch?v=LuScIN4emyo


Well, COPPA (the law behind this) targets companies that create content aimed at children, not the children themselves. And it is the content the law is interested in.

The observable (and controversial) effect of this is an outcry by content creators; I suspect the measures they take around accounts identified as children are much less visible, and entirely orthogonal.


I suspect Google doesn't want to tip their hand and show just how unreliable their demographic data is. There's a perception among the public that these companies siphoning up so much data must be able to know everything about individual users, and in fact Google's largest revenue stream relies on this perception. If I'm right, then when the house of cards falls on this Big Data Bubble it will be painful for a lot of tech companies, Facebook and Google especially.


>I suspect they don't want to tip their hand

I think that's a bridge too far. Unfortunately, I don't think anyone would actually want a world where ad-targeting data could be used as a form of ID to prove your age...

If you want the data to be as good as an ID, then everyone would need to present some kind of government-issued ID in order to create an online account, to make sure they weren't a child. Anything short of that, and you're going to have accounts that purport to be 21 when the 'real' user is only 12.

Or maybe you think they should use Machine Learning? So you then monitor every youtube video the 21 year old user watches, and determine that they are watching My Little Pony content over and over... so surely they must be a child. And then lock down the account and prevent them from making comments/purchases/etc?

The child protection laws are very strict. I don't think any of us would like a world where you get treated like a child (e.g. no right to view or participate on most of the internet) because of some algorithm. Civil suits against content creators seems like one of the only viable ways forward to me, as stupid as it is.


I don't think any of these things, no. I think content, tracking and engagement restrictions should be performed on the Youtube Kids app, and that parents should be responsible for ensuring their kids are using the appropriate apps/accounts. I think restricting features based on the content of the video is doomed to failure, as many have discussed in this thread.


Tangent: what does it matter how reliable your demographic data is when you're doing digital marketing? At the end of the day what matters is your ROI from placing ads; whether they are shown to the demographic you had in mind is more of a detail (you might think women 24-30 should be your best demographic, but maybe the real decision function is non-linear, and fuzzier algorithms get you a better ROI).


Determining ROI in marketing is a famously hard problem. As the old saying goes, "Half the money I spend on advertising is wasted; the trouble is, I don't know which half".


Fair, Google may not trust their inferences to the point of them becoming a legal liability.


Do you believe that relying on their creators to be honest about the content is a better bet?


I would imagine Google's legal team would answer that with a resounding "yes."


I'm not talking about Google's legal team. We already know which way they're going, given the article. I'm talking about reality.


Ok but the post you replied to was speculating on Google's motivations, and therefore talking about legal liability.

I guess if you are speaking of "what is best for society", ok, but that's a completely different thing. It sure seemed like you were debating the post above, but I guess not.


> Ok but the post you replied to was speculating on Google's motivations, and therefore talking about legal liability.

It was my own post and it had nothing to do with legal liability. As I said (clearly, I thought), I think Google is reluctant to show the world just how unreliable their demographic data is.


> The suspicious thing is YouTube tries to detect children's content, not users who are children…

These changes are happening because they were tracking children (well, they were tracking everyone, which included children). They can't store any data from a child, none at all. That means they can't detect whether a user is a child, and even if they had a magical way to do it, storing an "isChildren" variable set to true would itself be illegal.

Under COPPA, a video targeting children is taken as proof that they knew children were watching, and thus tracking them. So instead they move the responsibility to the content creator to say whether the content is for kids or not (thus not storing data on the child, while still being able to not track the user, who is now assumed to be a child).

The only way they can store data about a child requires parental permission.


The problem is there is no good way to tell if a user is a child without violating their privacy. You can't just ask their age; they'll lie.

Targeting content aimed at children is... I'm not going to pretend it's good, or even long-term viable (malicious videos are still going to slip through). But it passes the extremely low bar of being better than age-verification gates.


How would YouTube know the user is a child?


Either they have a children's account (https://support.google.com/families/answer/7103338?hl=en)

Or

Google figures it out the same way they infer any number of things about you and use it to resell advertising. If they can figure out someone is a woman between 19 and 25, I'm pretty sure they can figure out with a good degree of accuracy that a user is under 15.


In a similar vein, if it's illegal to do enough tracking on a 12 year old to identify them as a 12 year old, and 12 year olds exist, then it's illegal to do any tracking, right?


That's not my reading of COPPA (but I'm no lawyer). Just because you can't individually track a user doesn't mean you can't store a cookie with a single flag like "under13=true". COPPA is about personal information, not pretending children don't exist at all.
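
As a rough sketch of what that could look like (the flag name is just the hypothetical one above, not a claim about how any real site does it):

    from http.cookies import SimpleCookie

    # A single non-identifying flag; no user id is attached to it.
    cookie = SimpleCookie()
    cookie["under13"] = "true"
    cookie["under13"]["path"] = "/"
    cookie["under13"]["max-age"] = 60 * 60 * 24 * 365  # remember for a year

    # A server would send this header alongside the page:
    print(cookie.output())  # Set-Cookie: under13=true; Path=/; Max-Age=31536000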


I'm ok with this myself. The web would not stop working (or even noticeably degrade) if all persistent tracking was illegal/disabled.


You could just turn the negative-detection into positive detection: sell hardware that you have to be 18 to buy, and then provide your service only through that hardware.


Parents will buy said hardware for their kids.


Well, that's fine, then; with enough clear labelling, that's legally entirely the parents' fault.


Google already requires accounts whose birth date indicates they are 13 or older, and thus not subject to COPPA, to verify their age by charging a credit card or sending in a copy of a government-issued ID. Any kids that were watching either stole a credit card or had the account created for them by their parents. The FTC decided this was not adequate, so I don't expect they would deem your scheme compliant either.


What? I don't remember sending Google an ID when I last created an account.


News flash! They already did buy specialized hardware that you need to be over 18 to buy! It's called a smartphone with a data plan!

There is pretty much no way you could say that purchasing a universal communication device is carte blanche for a child to view any content online.


> specialized hardware that you need to be over 18 to buy

Not always. Minors are allowed to have cellular service with parental consent in some cases.


As far as I know, no website in the world uses ML to decide that you are under 13. They can't by law, afaik, though IANAL. My reading of COPPA is that you need the user to tell you they are under 13, or learn it via a parent. If you think they're a child based on websites visited, or whatever, that's not "actual knowledge" that they're a child.


That sounds to me exactly like what SuperAwesome's KidSwitch does:

https://news.ycombinator.com/item?id=20011860


Oh they don't. They absolutely completely have no way at all of knowing. Just ask them, no way of being sure.

Course the data is all there, but they just really promise they never put it together and knowingly went after kids.


Assume, just for the sake of the discussion, obviously, that it may be possible, using some certain commonly accepted methods, and with some reasonable, commonly accepted margin of error, to detect that a user, signed into an account, is NOT a child.

Will this help?


Depends on whether you're willing to tolerate false positives.

To start with, I'm not a huge fan of the idea that the Internet as a whole could be legally obligated to assume I'm a child unless I provide every site I visit with a driver's license, for example.

But let's take your assumption to the next level: let's assume that there's some privacy-respecting way of doing this (I don't believe there is, but we'll temporarily pretend it exists).

I'm also not a huge fan of the idea that the Internet as a whole could be legally obligated to assume I'm a child based on criteria that are outside of my control. I personally think most of our experience with filters online has taught us that they're best implemented close to the consumer, not close to the source.

To be clear, both Youtube and the FTC are turning this into a dumpster fire. But the idea of filtering content that's designed for children is better than the idea of filtering consumers. It's just that this implementation of that idea is garbage and unnecessarily hurts creators.


Filtering? It’s a discussion about privacy....

(What content/functionality is available to what user is a different question, and, I think we both agree, should be left for the provider to decide.)

(And I always find internet-as-a-whole arguments to be a straw man. We are talking about YouTube and Kids. Nothing more.)


Youtube and Kids are making these decisions based on FTC requirements. The FTC's rulings here have implications for the Internet as a whole.

> Filtering? It’s a discussion about privacy

The point being, there's no way to determine whether someone is an adult without violating their privacy, and even if there was, the false positives would be (imo) an unacceptable tradeoff. If you want to have a version of COPPA that works without violating privacy, you have to focus on either labeling or filtering content that's uploaded to a service. Trying to filter users based on their age just doesn't work.


Assume, just for the sake of the discussion, obviously, that it may be possible to determine, without violating their privacy (and actually without doing a damn thing), that someone might be a child?

Will this help?

(There’s an elephant in the room. All-of-internet wants it straw elephant out (I think). The lawyers just got confused.)


You will still have misclassifications in both directions: either children slip through (which the FTC is not a fan of), or adults are unfairly restricted (which everyone else should have a problem with).

Even in that scenario, content filtering is more feasible and practical than detecting and banning children.

But, pushing that problem aside as well, sure. Assuming that you didn't have false positives, and assuming that you didn't have to violate anyone's privacy, and assuming that the solution wasn't obscenely expensive, that theoretical solution would be very helpful. It would basically solve the entire problem, for pretty much everyone.

But that theoretical solution doesn't exist.

If we're willing to make those assumptions, I would also like a special gun that only shoots bad people and doesn't work in schools or churches. And it would be nice to give law enforcement with warrants access to lawful information without requiring encryption backdoors. Of course, we'll also need to implement the new training regimen that gets rid of any corrupt prosecutors or officers.

I'll happily posit things for the sake of discussion, but there's a point where I'm suspending so much reality that the discussion just breaks down. I can't stop looking at the proposed solution and saying, "yeah but this thing you're talking about doesn't work." I guess I'm just not sure what you're getting at.


”adults will be unfairly restricted” from what?

Asking for a friend.


Let's say Youtube's theoretical solution labels you as a child, even though you're not. Without some kind of way to get around that block, you've now lost access to Youtube.

Note that this scenario isn't entirely theoretical. It's rare, but there are already subsets of Youtube (trailers for horror movies are where I see this most often) that are locked behind a login screen because the uploader has indicated they're intended for mature audiences. You either log into Youtube (sacrificing some privacy) or you just don't get access to those videos.

This ends up being not a huge problem today, because the amount of content falling into that category is very narrow. Imagine a world where you fail Youtube's theoretical privacy-preserving test, and suddenly you can't watch anything, regardless of the content, that isn't designated as being specifically directed at children.

Of course, on some level that's just Youtube's choice. But now imagine if a government agency like the FTC dictates that every large Youtube-scale platform has to implement this test. Suddenly you find yourself not only unable to watch videos on Youtube, but also unable to make a Facebook/Twitter account. Imagine if you can't search for certain things on Google.

Note that this final scenario also isn't entirely theoretical with the FTC. My reading of the comments that the FTC commissioner has made is that the value of targeting Youtube is specifically that it's a centralized choke-point for this kind of content. I don't see any reason to assume that if creators migrate off of Youtube, that the FTC won't follow them and impose similar rules wherever they end up.


It’s about privacy not content.

(I have a lot to say about content, including suggesting specific solutions to specific problems, but I’ve decided to not air this in a public forum. Further discussion is pay gated. Sponsor me on github for more !!! :))


What is a "user"? Is it the owner of the account? Or is it the person using the account at the moment? When I'm signed into the YouTube app on my smart TV, and anyone who sits in front of the TV can use the app to play content, then who is the "user" of the app? Can you detect that my child is currently the one sitting there flipping through videos?


It's better to filter out content that targets kids.


Where do you think Google Analytics demographic data comes from?


I don't know. Enlighten me. And how accurate is the identification?


Accurate enough that they sell it to advertisers who want children to see their ads.

If their demographic detection were bad, they would have no revenue and no customers, since that's the flagship product they're selling.


The lowest age range YouTube Ads lets you target is 18-24.


There are plenty of age proxies that youtube happily lets advertisers target. Advertisers can tell youtube "Show my ad to fans of Peppa Pig".


My daughter uses my YouTube account to watch children's videos. She has zero interest in watching anything targeted at adults.

Content is likely the strongest indicator of whether someone is a child or not.

