The nice thing about Netflix/Hulu/Amazon/PBS/etc. is that a human decides if the content is for kids.
YouTube is still trying to get away with "letting the machines do it", which works for most aspects of their business, but not this.
I used to let my kid watch YTKids while I supervised. But no more. Even with my supervision it just jumps to some insanely inappropriate video, too quick for me to shut it off in time, and it's already done damage.
When a scary clown jumps out and screams, that's traumatizing, even just for a moment.
I have no trouble letting them watch all those other platforms. Heck, I let them watch PBS unsupervised.
But YouTube Kids will never play in my house again until I know that every piece of content on there has been reviewed by a human before it goes live.
Citation needed. I mean, they do use AI for lots of stuff, but let's not describe it as working. It has routinely caused massive aggravation for their biggest content creators, and it's one of a group of changes that has made YouTube an unsafe monetization platform and driven the creation of places like Patreon.
Sure, this is the only AI fuck up that is going to get them sued in federal court and face charges of exploiting children, but it's not like the other AI stuff is working well. Basically, their AI is doing a bad job everywhere, it's just that this area is where they face legal liability and so they can't pull out their usual "but the machines told us to!" defence.
But you have a point. YouTube and Google get a lot of flak for things that are emergent properties of their lack of humans.
YouTube doesn't release many of the details of its methodology, or even the content that it blocks, because that can then be circumvented by the people who create and/or circulate it. It can create a Streisand effect, which only makes that type of content worse by drawing attention to it.
Some of what you can see is in the child sexual abuse material (CSAM) APIs that Google is now sharing to help other organizations detect CSAM en masse in their content. These services were originally built to do so on Gmail, Drive, and YouTube.
It's inevitable that in such an environment the system they come up with ends up making less and less sense over time. It's death by a thousand cuts, and I don't think there's a good way to really avoid it.
On the one hand, because they are this large, they cannot curate content. On the other hand, they are arguing that because they are this large, they are the only ones who can curate this content.
I would feel more sympathetic to your argument if the companies wouldn't argue both cases at the same time.
I don't really care about sympathy, so much as I care about reality. The companies will do as all companies do. Try to make money by garnering confidence in their product & abilities. Even if they have to lie through their teeth.
Reality is, there isn't a good system (currently) to really curate massive amounts of user submitted video content, in an environment where anyone anywhere can submit said video content.
The problem's scale is always going to lead towards either a compromise of being an open video platform, or a compromise in relation to curation.
100% agree. YT kids is not safe for kids. I wouldn’t show it to my kids. I instead download the videos I want and manually make a playlist. There is wayyyyyyy too much garbage in YT kids.
Is this the best they can do as parents? I mean, we all know that sitting dumbly on the couch and passively consuming whatever's on TV ain't healthy mentally (or physically), and they ingrain this approach in their kids from a very young age?
Parenting ain't easy, never was and probably never will be, but damn, invest some proper human time in it; don't just offload raising your kids to some stupid website that lives off ads. This is the best-invested time of your life.
I have a 2-week-old newborn, and damn, he ain't getting close to TVs/phones/tablets till much, much later, and even then in a limited manner.
After 1 she also got increasingly harder to feed. She would throw away the spoon with her hand and not open her mouth. Just a massive rebel in general. We were really stressed out. She was starting to fall off the growth curve. Pediatrician was concerned too.
Then, while she was eating, we once showed her the ABC Kids "Wheels on the Bus Go Round and Round" rhyme video; not sure what clicked in her brain, but it was a night/day difference. No rebellion, eats well.
So we kinda use it as the last resort and feel a bit guilty.
You used something she wanted as a carrot (you couldn't not) to get mutually beneficial behavior. You're still gonna feel guilt, but this seems well done to me.
Parenting is not something you can automate
I'd rather judge and shame bad parents now than have to deal with their poorly behaved offspring later in society.
Right now my kids are small. I think PBS kids should be enough for them. When they want to find something else to watch, I know they just had too much screen time and they need to do something else other than watching TV.
It's impossible to satisfy everyone, regardless of the content rules, and no matter how curated it is. YouTube could spend a billion dollars curating content, along with surveys where 90% of parents have to approve a video before it is included, and the remaining 10% would still be the vocal minority ruining it for everyone.
Sure, YouTube probably doesn't do quite enough. But when you have parents looking to let YouTube babysit their 2-8 year old children without supervision, whose fault is that? YouTube's, I'm sure. /s
When I taught kindergarten one of the children in the youngest class at the time was terrified of the Eensy Weensy Spider video and of the Elephant in a Hickory Dickory Dock video. I told her to close her eyes when it was coming up. She did. Eventually she got over it.
The idea that a scary clown can be “damaging” is a testament to helicopter parenting. Up to 1900 it would be a rare person indeed that didn’t have siblings die before they were ten, and basically no one would make it that far without having a friend or family member die. That’s trauma. That’s damaging.
You seem to think emotional trauma isn’t real. It is. And studies of its effects show that it is just as harmful to the brain as physical trauma.
If nightmares and screaming are repeated over and over again, that can lead to some very traumatic experiences in their young brains.
Previous generations also used to “instil” a sense of fear into children by beating them. It does make them listen, as they are then ruled by fear-inducing trauma, and yet I think we’re all clear on how it’s not the best option for their growth.
And I never let them watch unsupervised. But at least with every other streaming service, I don't need to preview the content ahead of time.
I'll just use one of the many other services that does it for me.
"it's too hard"
which translated of course means, "the cost benefit analysis doesn't come down in favor of spending the effort".
Caveat emptor, informed consumer, personal responsibility, blah blah blah.
What the Freedom Markets™ advocates don't acknowledge is transaction costs. If consumers have to vet every single activity, the economy suffers.
If YouTube Kids is marketing themselves as kid friendly, it'd be nice if it was true. My skepticism budget is finite.
Forcing YTKids to only show PBS Kids content on every platform is a huge pain in the butt.
I'm just going to assume you are being sarcastic and making an ironic point about parents wanting to control what their child sees but then not putting any personal effort into it.
The harsh reality is that parents have to do work. I remember when I was a kid I hated that my parents looked through the books I checked out from the library to see if they were "appropriate" and that sometimes they would remove horror novels and other things that they thought were not age appropriate.
But the truth is at least they cared enough to check and at least they were consistent, and later on when I became a teenager they gave me more freedom. If you have opinions about what media your child has access to you have to do the work to curate...
Of course I want to review the content too, which is why I usually watch new stuff with them. But I want the service to take the first go at it. Like every other streaming service for kids.
Same thing at the library. When I get a book from the kids section, I can be reasonably sure that I won't be surprised by some adult content when I sit down and we read the book together.
I’m not arguing that YT is inherently bad for all purposes; just that, in this situation where you want to watch kid’s videos and be confident all the material will be age-appropriate, there are lots of apps that are simpler and better.
Rather than installing YouTube and whitelisting PBS Kids, why not just install the PBS Kids app instead?
Too much work. I'll just watch one of the many other services that do it for me.
It still can be someone else's problem.
Also, they still don't make it easy to lock down to just that content on every platform.
I don't have to worry about this with the PBS Kids app. I know that no matter where it runs it will be safe. My house, grandma's house, whatever.
Same with Netflix Kids. I don't have to worry about it having bad content anywhere my kid figures out how to use it.
I’ve never really understood how YouTube Kids was even a thing.
It's fine for Youtube Kids to be a much smaller selection of manually curated videos from specific partners, because Youtube Kids isn't replacing Youtube as a whole.
Given that their current solution basically kills ad revenue on those videos anyway, it's not like creators are going to be clamoring to be included on Youtube Kids in the first place. There's no harm in having a subset of Youtube that is highly gatekept, the same way that subsets like Youtube Red were.
Go ahead and let Youtube itself stay as a kid-unfriendly, algorithmically moderated wild west with hidden bombs of objectionable content spread around. That doesn't need to be gotten rid of, just have one corner on a separate subdomain that's nice and safe for kids.
That's exactly what the FTC fuss was about, though. It's easy to say that until a parent gives their kid access to it and suddenly YouTube is advertising and recommending stuff to a child, which the law doesn't provide a safe harbor from (not for the content uploaders, either, which is going to cause a lot of problems). The FTC guidance was to identify kids instead by the content they were accessing. Hence Google's complaint: https://youtube-creators.googleblog.com/2019/12/our-comment-... (tl;dr skip to the "Treat adults as adults" section)
edit: now I'm confused because your comment below (https://news.ycombinator.com/item?id=21975870) indicates you already know your idea here isn't going to work for the FTC?
To be blunt, even the FTC's own proposal isn't going to work for the FTC (at least not to the degree they want) -- it's a bad solution with unintended consequences. I've brought up a few problems, but other people have brought up far more of them. At a certain point, beyond explaining the FTC's policy and showing the problems, it's not really worth spending the time trying to come up with a way to make it work. It's better to come up with solutions that would address the concerns of actual parents, like jedberg.
Importantly, the FTC's approach is the opposite of what parents like jedberg need. Parents need a highly curated feed that they can feel safe showing to their kids. The FTC's approach is going to lead to creators erring on the side of labeling too much of their content as kid friendly. It's also going to force YouTube to get more aggressive with its algorithms, which will inaccurately label objectionable content as kid-friendly.
Parents need fewer videos, more carefully classified as kid-friendly, and the FTC is making sure that a lot more videos will be carelessly classified as kid-friendly.
Like I said below, my assumption based on the FTC comments I've read/watched is that they just haven't thought that far ahead. They're mad about Youtube in general, and they see creators as an easy way to get at Youtube. That's maybe an overly-cynical take on it, but I'm not super-inclined to be charitable to them on this one.
They don't. The cost of providing the service they pretend to provide would greatly exceed the income on that service. Even with the most aggressive advertising one could imagine.
You can imagine how Google might automate this otherwise, and the nightmares that would follow. Say they look at the ages of the people viewing a video, and if enough viewers below a certain age watch it, they automatically classify it as content for children.
Of course, ideally something in the middle would be a start: see what children are actually watching, check it out, and decide whether it needs an 18+ filter or not.
They must already be doing some checks, like verifying the age on logged-in accounts and making sure adverts aren't unsuitable for children.
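The viewer-age heuristic sketched above can be written down in a few lines. This is purely illustrative of why it's a nightmare; the function name, cutoff, and threshold are made up, not anything YouTube actually does.

```python
# Hypothetical sketch of the "classify by viewer age" heuristic.
# All names and thresholds here are assumptions for illustration.

def classify_as_kids_content(viewer_ages, age_cutoff=13, share_threshold=0.5):
    """Flag a video as 'for kids' when enough of its logged-in
    viewers are below the cutoff age."""
    if not viewer_ages:
        return False
    young = sum(1 for age in viewer_ages if age < age_cutoff)
    return young / len(viewer_ages) >= share_threshold

# A video watched mostly by young accounts gets flagged...
print(classify_as_kids_content([6, 8, 9, 35]))   # True
# ...but the same rule would mislabel an adult video that one
# classroom of kids happened to discover.
print(classify_as_kids_content([25, 30, 7]))     # False
```

The failure mode is obvious: the signal (who watched) says nothing about the content itself, which is exactly why a human check in the middle matters.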
>But YouTube Kids will never play in my house again until I know that every piece of content on there has been reviewed by a human before it goes live.
Yes, as it stands the systems Google has in place are not secure, and because of that, unsuitable content could find its way to your children even when you have done all the right things to prevent it. Equally, there is the accountability. If PBS drops a boob at the wrong time when children are watching, it will get dealt with and a process put in place. If YouTube Kids does that, nobody else may ever know, since nobody is checking an individual's stream the way they would a network broadcast, let alone with the same level of accountability and process.
Yes we absolutely can. YouTube's business model depending on user generated content is their choice. Not being able to review all the content is entirely a problem for Google to solve. Maybe they should stop optimizing for growth and put up some hurdles to who can upload content? That's for them to figure out.
I'd be surprised if there weren't a bunch of kids who really wanted to see content from the latest Minecraft update.
Sure, it will start shifting once they want to watch something their peers watch. But at the time parents choose the media, how would they even know it's not the latest production, or some mix of old/new?
Peppa pig started in 2004. At the time kids are interested in it, if you show them ep 2, will they say "this is not from series 6 from 2019"?
Even with a more restrictive approach Google/YouTube could whitelist specific content providers, which could provide fresh content on a frequent schedule.
I was about to suggest that content creators pay a nominal fee for the content to be reviewed but if creators have no way to monetize the platform, that proposal is bijective with suggesting that Google just shut down YTKids. Or are parents paying already for access to YTKids and the reviews would come out of that?
The argument here is whether imperfect curation is better than none, and the history of television tells us that it is. We never had a TV channel that randomly showed kids pro-suicide messages.
This sentiment is so shocking that I had to say something.
Please no. Let's not encourage a world where people are even less free to share their ideas.
I get where you're coming from, but "Think of the children!" has been used to justify so many evils throughout history. It's an emotional appeal that should be resisted.
Putting up barriers would also be the first step towards losing dominance. But in the short term, it would suck not being able to share a video without waiting a long time or being approved.
But won’t someone think of the poor content creators who might have to wait or upload their content somewhere else!
"Think of the children" is equally bland.
Somehow YouTube became the parent that oversees what content your kids get. This, to me, is a dereliction of your parenting responsibilities.
I think the issue here is that YT is trying to claim they are providing a YT Kids offering, comparable to and competing directly with Netflix, Amazon, PBS, etc. If they want to make this claim and have parents allow their children to view it, then they need to back it up.
edit: I am sure there are more checks in place and that the above idea has been considered. My main point is that they need to take responsibility for whatever claims they are making. If they can't back up the claims, then that is fine - just stop making them...
Crazy idea - maybe no one should be operating at that volume until they can figure it out?
If you don't want your kids on a product, then supervise them. Don't ban the product for everyone who can use it responsibly.
I'm certainly not one to say everything should be as safe as possible for kids, but there's a limit to what a parent can reasonably be expected to understand. If I set my kid up watching a blues clues clip, and look away for 30 seconds and he's seeing a clown with its arm torn off which autoplayed, I feel like I'm reasonably kinda entitled to go hey wtf google, your shit is broken and its harmed my kid.
And then, of course, not let my kid use youtube anymore.
YouTube throws up videos to children that look fine until you get to the few seconds some psychopath has spliced into them. It’s literally not possible to safely put your kid in front of YouTube unless you have pre-watched every video YouTube might potentially auto-serve up to your 3-year-old.
It won't stifle communication on the internet at all. Trust us.
The problem with excessive regulation is that complying with it can be extremely complicated and time consuming. If you don't have the money to pay for that, you either risk being fined or sued, or you don't play.
It sets a bar that only already wealthy people can get over.
I hope someday someone will read it.
HN: parents need to stop being helicopter parents and kids need to have independence to grow in order to be functional adults, otherwise you're a shitty parent
Also HN: parents need to know everything their child does and know what they are watching at all times, otherwise you're a shitty parent
If they were honest about their lack of actually effective triage, then this whole controversy would be of a very different valence.
Stop pretending you’re triaging well if you’re not!
1. it's from a trusted provider, send it on through. Add a complaint mechanism that makes it fairly easy for a trusted provider to be removed from this list (DMCA for content, if you will).
2. it sits in a queue until a human looks at it. Prioritize videos from those who were recently removed from the trusted provider list as a way to dampen the harm from abusive reports.
Suddenly, it's a much smaller problem.
The only way this works is to have a human review every piece of content before it goes live, like every other streaming service with a kids section does.
I don’t have kids, but this seems a bit over the top. Children live in the world with the rest of us, and are exposed to things every day that haven’t been reviewed and approved as child safe. Angry drivers on the road may flip you off or yell at you while your kid is in the car, or someone at a stop light may be blaring music that you find to be offensive and not approved for consumption by children. Both of these things would objectively be scarier than a clown on a computer screen, yet kids see and hear these things every day and somehow most of them don’t grow up to be serial killers. Sure, try your best to shield them from harmful content, but completely shutting down something that is 98% clean, educational, child friendly content seems like throwing the baby out with the bath water.
We are not talking about some mildly offensive or violent behaviour. We are talking about things that would be illegal to show on prime time TV (child and adult timeslots)
10 great child videos does not make up for 1 psychopathic murder scene spliced into video 11
Have you ever personally witnessed a “psychopathic murder scene” being spliced into a YouTube video? Do 10% of videos really have this in them? I have never seen it once, in well over a decade of relatively heavy YouTube use. Most of my friends are in the tech scene and have children, and none of them have ever mentioned this as an issue either.
wrt piracy: keep in mind that parents are often relatively financially disadvantaged, as well as being the most financially targeted group. Some parents turn to YouTube rather than (say) Disney+ for a reason. Those that can afford it do so because it takes less mental time and real time than doing even community-level curation.
My complaint is the opposite: lots of good, sensible content (most recently I was trying to set up my youngest to watch drum cover videos) just isn't there because no one thought to put it there.
You have to whitelist a few channels, then it works ok. Though my daughter is only 3 now, and would be happy watching Blippi all day. Seems like it would be more complicated to whitelist channels as kids get older.
OK, that's just not true. There's a ton of great stuff on YT Kids (though as I mentioned not enough). But yes, you have to find it for them and show it to them yourself. Left to their own devices most kids aren't going to make good choices. They're kids, that's the point.
This is just the "be a parent" thing, really. There's absolutely no way to get around the need to educate your own children. Certainly Google isn't going to do it for you. And Google certainly didn't invent junky content; I remember staring at Hanna-Barbera garbage constantly as a child.
The funny thing is, I had to choose my child's age range (or enter the birth year or something) when setting up the app. Then, when I went to whitelist channel by channel, it had a bunch of recommended channels full of good/decent content. That should have been the default whitelist out of the box.
It isn't just about "being a parent". My daughter is very smart for her age. My wife used to be a teacher, but since we had kids, she stays at home with them. She has taught my kids a lot. She has learned a good amount from content on YouTube, Netflix, and Disney. I think it's good that she can use an iPad better than my parents (even though she can't read). She also can solve puzzle games on her own. Of course we guide her and limit her usage. My point is that an app marketed to kids should not be full of garbage content.
Ps: I'm not trying to change your mind, I'm trying to understand the reasoning behind it, coming from a more liberal raising and as a result a more relaxed attitude to what is banned and what is not banned.
As a big Leave it to Beaver fan I find this hilarious. And would pay anything to watch this episode if it existed.
The big tech companies are so far removed from real users that they'll just tighten a few screws on their machine learning and call it fixed. You're a roundoff error to them, unfortunately. These people are the real machines, servicing the electronic machines to make money for their masters, the execs and stockholders.
"As a parent" is one of my favourite phrases, by the way. It translates to "I know better than you, regardless of your parental status". Which is fine. I was only wondering about the reasoning, after all. :)
I sympathize with this. Lots of parents are vociferous morons, because lots of people are morons and becoming parents does not fix that. But: There is a real difference, at least for many people. Your brain rewires itself to be hyper-vigilant about child safety and child harm. I can't watch TV or movies anymore that involve harm or the threat of grave harm to children, but these wouldn't have bothered me before. When I enter a room, I subconsciously scan it like Jason Bourne if he were a toddler's bodyguard. Slippery surfaces? Sharp table corners? Unfenced stairs? I've compared notes with friends and this is a common, if not universal, change in parents. They spend a lot of time thinking about child safety.
Yes, if you are worried, the smart thing to do is not to expose your kids to YouTube at all. Wishing that YouTube could be better is independent of whether you choose to expose your kids to it.
Also, it has nothing to do with being liberal or conservative, at least with me. I will let my kids watch a lot of stuff, if we intentionally pick it out. If you saw random baby shark or finger family between whatever videos you wanted, you'd be right to complain and no one would blame you.
There is zero product design behind the way kids are exposed to scary things. No product manager decided it should be that way. It just is that way because it's not prioritized by the business or it's too expensive.
Imo if YouTube markets at kids they should be accountable. They should clean it up.
My kids are too young to have unrestricted access to the internet. That's why they are limited to specific kids apps.
But YTKids is different than all the other kids apps. All the other apps have a human check all the content before it goes live.
So I can be reasonably certain that the content will be ok without watching it all first.
With YT, I can't have that certainty. So they just don't get to use that platform anymore.
Yeah, bad things happen, and when they do I talk to my kids about it.
But I don't want to purposely subject them to such things if I don't have to.
We let ours decide what to watch / play out of a handful of things we've hand-picked as OK (such as Daniel Tiger's Neighborhood or Super Simple Songs), and iOS's guided access mode has been a godsend for this.
Super Simple Songs is on YouTube and has great content for kids, but I refuse to use YouTube Kids; we tried it once and couldn't control it well enough—weird videos started creeping in almost immediately. So, we use regular YouTube and disable autoplay, which we have to manually check every single time, because YouTube has randomly re-enabled it for us a couple of times, which led to the weird videos again.
We do limit her screen time (and consequently, ours), but trying to keep her away from them 100% of the time? That's honestly not realistic, nor fair.
At our daughter's previous day home, the provider started letting her watch a YouTube channel called BabyBus (against our previous agreement of no TV, which was the straw that caused us to quit her service). At first glance the show is innocent enough: it's just a bunch of cartoon characters singing and going on adventures. But if you actually pay close attention to some (not all) of the episodes, it's a weird fear-based morality show from China that threatens death if you don't behave a certain (rather authoritarian) way.
The frustrating part about it is that there is a ton of really great kids' programming on YouTube that simply wouldn't be possible without it. It just takes an order of magnitude more effort to sort it out than pretty much any other service that provides content because you have to curate everything yourself.
I ask out of curiosity and personal learning, not out of any sort of judgement. I do not have children and if I did would likely not have had so in time for this to occur.
You don't realize just how much free time you really have until a toddler demands it all away from you. Meanwhile, dinner needs to get cooked, dishes need to be done, laundry has to be washed and folded, the house needs to get cleaned, etc. And your kid wants / needs attention every few minutes (at least ours does—she can only colour or paint or play with toys for so long by herself before she wants to play with you).
First thing I do on a kids device is ban the YouTube and similar apps and set web to a whitelist. Check the screen time settings.
My solution has been to curate specific videos, then embed them in HTML pages and turn off the suggestions that would normally appear with the video.
Yes, it’s more work, but I see real benefits in this approach.
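One way to sketch this approach is a small script that generates the HTML page from a hand-picked list of video IDs. The IDs below are placeholders, and note one caveat: YouTube's `rel=0` embed parameter no longer hides related videos entirely, it only restricts them to the same channel, so this reduces rather than eliminates suggestions.

```python
# Sketch of the "embed curated videos in your own HTML page" approach.
# Video IDs are placeholders; rel=0 only limits (doesn't remove) suggestions.

EMBED = ('<iframe src="https://www.youtube-nocookie.com/embed/{vid}?rel=0" '
         'width="560" height="315" allowfullscreen></iframe>')

def build_playlist_page(video_ids, title="Kids playlist"):
    """Return a self-contained HTML page with one embed per curated video."""
    frames = "\n".join(EMBED.format(vid=v) for v in video_ids)
    return (f"<!DOCTYPE html>\n<html><head><title>{title}</title></head>"
            f"<body>\n{frames}\n</body></html>")

page = build_playlist_page(["VIDEO_ID_1", "VIDEO_ID_2"])
print(page.count("<iframe"))  # 2
```

Using the `youtube-nocookie.com` embed domain is YouTube's own privacy-enhanced mode, which is a reasonable default for a kids' page.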
They're children, not small adults.
Or if it is, they need better hands, because they're terrible at it.
The word for this is "parent".
Many kids do not follow these rules. They get punished and have valued things taken away. Some learn a lesson and will follow the rules and others do not.
The ones who still don't listen have to be with you when you cook and clean. If you go out, another parent/babysitter/older child will watch them.
What rules can you set in place such that your kids don’t accidentally see something explicitly targeting them with gore or other traumatic content?
The only rule that seems to make that possible is “don’t use YouTube Kids”.
It is not Youtube's responsibility to raise your children.
I think this is where you lose me - why is allowing a child access to have un-tethered access to a content portal appropriate? i.e. why is the responsibility on the content portal (YouTube) and not the parent or supervisor?
I might be a tad too much on the individual responsibility side of the spectrum here, but just like the old argument from the 90's ("The TV ain't a replacement for a babysitter...") I'm not sure the responsibility necessarily needs to be on the platform here over the parent/supervisor.
One suggestion: You can create a white-list or playlist of pre-approved content creators and/or videos, and offer access that way.
That’s parenting - avoid the shitty services that can’t put out a good product so you can focus your effort on something more meaningful. Or YouTube can fix the problem and it will be allowed in my household.
The same trust cannot be extended to YouTube.
If YT can't and won't do it, they shouldn't label it as kids content. They have a separate product/brand for it, so it should be what it claims to be.
because given how easy it is to access content on the user side there's no other way to do it feasibly. You can take the tv remote away from your kids, and that only turns the TV off. Tablets, pcs and so on are general-purpose computers with plenty of workarounds, it's extremely hard practically to stop kids from accessing content that they can technically access without greatly diminishing the value of the device or service.
Content creators who are not intending to target children with their videos are taking steps to ensure they're not ensnared by the onerous definitions the FTC has given for "children's content." The FTC has indeed explicitly said that the content creators, not YouTube, are responsible for flagging their videos as "for kids," and the content creators are personally liable for a $40,000 fine if the FTC thinks they made a mistake. So videos where adults dress up in costumes, producing videos for other adults, but that a kid might like? Yeah, they don't wear those costumes anymore, because the FTC might consider them targeted-to-kids.
Meanwhile, actual garbage aimed specifically at children, like Elsa-gate and animated shorts with instructions for viewers to commit suicide, go untargeted by this non-fix.
As I understand it, if he marks his videos as being for kids, they end up effectively demonetized and comments aren't allowed. If he doesn't mark them as being for kids, he's at risk of massive fines. If he marks them as adults-only (or just starts swearing or the like), they're also effectively demonetized.
Guess this explains all the push by a lot of creators towards Patreon and the like - the simple solution is to have your videos on YouTube marked as for kids, but to push for people to support you on Patreon to get access to discussions, a channel-specific Discord, etc. for the kinds of things that would previously be in comments.
Edit: Oh, and no notifications when new kid-friendly content is posted, so if you want notifications 'Support me on Patreon'
The blog post from Google mentions no longer serving personalized ads, which sure is much less effective, but it's still monetization.
In a weird coincidence my SO told me a few minutes ago about how Youtube is cruel right now by showing her DisneyWorld ads. It's clearly happening because she watched a bunch of "Into the unknown" video clip from Frozen 2 recently.
It made me question two things: have they already implemented these changes (because to me, any content from Frozen 2 is targeting kids), and more importantly, does the COPPA thing apply to ads?
If it does apply to ads, any Disney ad is clearly targeting children, so any decision to show these ads based on tracking should point toward tracking that targets children.
we need good decentralized platforms, no?
That would be a combination of "get us in trouble and you're in it too" and "as you can see Your Honor, we've made strong attempts to enforce keeping children out of the community."
to replace youtube, it'd have to be both decentralized video delivery, discovery, social as well as decentralized monetization -- which i think could be patreon style or mayyyybe some sort of ad delivery.
not sure what the legal ramifications would be exactly, but I certainly hope we aren't in a place where otherwise-valid enterprises would be banned unless you could (impossibly) prove kids weren't using them.
(also there have historically been lots of people making $ off providing targeted-to-kids content (toy unboxing, slime videos, tasting candy, &c), and it's not clear to me that's a bad thing...?)
The number of creators I've heard of who got within a few days or a week of permanently losing their account because of some weird combination of things, and only got it back because a friend of a friend (not their official YT liaison, even) was able to find out what was wrong and how they needed to fix it, is alarming.
How about RSS? Each channel has RSS, and iirc it has only failed once (as in, a few days with no updates even though there were new videos) in the last 4-5 years. I don't use Youtube's system for following channels anymore, I just have a YouTube 'subscriptions' category on my feedly.
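For the curious, the channel feeds mentioned above are plain Atom XML, so following a channel outside YouTube's notification system only takes a feed reader or a few lines of code. Below is a minimal sketch: the feed URL pattern is YouTube's public one, but the channel ID and the sample feed document are made up for illustration (parsing a local string avoids a network call).

```python
# Sketch: following a YouTube channel via its public Atom feed.
# The URL pattern is YouTube's documented feed endpoint; the channel
# ID below is a placeholder, not a real channel.
import xml.etree.ElementTree as ET

FEED_URL_TEMPLATE = "https://www.youtube.com/feeds/videos.xml?channel_id={}"


def feed_url(channel_id: str) -> str:
    """Build the Atom feed URL for a given channel ID."""
    return FEED_URL_TEMPLATE.format(channel_id)


# A minimal Atom document in the same shape as YouTube's channel
# feeds, used here so the parsing can be demonstrated offline.
SAMPLE_FEED = """<feed xmlns="http://www.w3.org/2005/Atom">
  <title>Example Channel</title>
  <entry>
    <title>First upload</title>
    <link rel="alternate" href="https://www.youtube.com/watch?v=abc123"/>
  </entry>
</feed>"""


def entry_titles(feed_xml: str) -> list:
    """Return the video titles found in an Atom feed document."""
    ns = {"atom": "http://www.w3.org/2005/Atom"}
    root = ET.fromstring(feed_xml)
    return [e.findtext("atom:title", namespaces=ns)
            for e in root.findall("atom:entry", ns)]


print(feed_url("UCplaceholder"))
print(entry_titles(SAMPLE_FEED))
```

In a real reader you'd fetch `feed_url(...)` periodically and diff the entry list against what you've already seen; services like Feedly do exactly this under the hood.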
"Have your viewers use RSS for notifications" is going to work as well as telling them to have all their viewers use Opera or IE.
Can you clarify on this? So, you're saying that the FTC will punish content creators who _don't_ flag content as "for children" if it winds up being "targeted-to-children" via some subjective notion regardless?
Those videos wouldn't show up on YT Kids, correct? So they're not going to show up for kids anyway...right? That sounds like the way things should work.
In the Ars video, the FTC chairman says the following (16:30):
> "On the Internet though, you don't know who your users are. So what the COPPA rule more specifically does, is, 'where the content is directed to children, we are presuming the users are 12 and under.' [...] If however you are a general-audience platform like Youtube, you don't know what the content necessarily on your platform is. [...] That's what we mean when we talk about strict-liability for content creators."
I interpret this quote (as well as the rest of the video) to mean that the FTC does not care whether or not the platform is directed towards children. Keep in mind that YouTube is already officially designated as not directed towards children under 12. If your child is using YouTube, you are violating their TOS. My personal interpretation of this video is that the FTC is saying, "we don't care if the platform is for children; if the content is for children, the rules apply -- no matter where you put it."
As to whether or not content creators should be liable for Youtube collecting data that the creators don't even have access to, that's another question. But the FTC doesn't seem interested in asking that question. It's pretty clear to me that the chairman views creators purely as a means to getting at Youtube, regardless of what the consequences are to the individual uploaders. Creators are the "fish in a barrel."
Looks like the two options are either the video is removed, or put behind an age gate.
Isn't the FTC's issue here that they simply don't believe YouTube when it comes to an account's age? A kid could be watching the targeted-to-kids content and still be on their older sibling's account, or they may have lied about their birthday (who didn't?) - either way, their argument is that YouTube and content creators are violating COPPA if the content itself is what they consider targeted to kids, and a kid watches it - regardless of what YouTube claims their account age to be.
So, putting the creepy videos behind an age gate - seems to not be a valid defense in light of the FTC's reasoning that age gates are not reliable. Which is why their silence on this side of the issue is confusing to me.
Unless there's some kind of tunnel vision going on here, and they're worried about the ad metrics or whatever that YouTube is generating from minors - and the FTC just doesn't care about the actual content of the videos, so long as you're very careful about what data you collect and how you monetize videos a kid might watch.
Of all the free services available online, YouTube is maybe the single worst example you could pick as far as "offering little actual user value".
The fact that any user across some 76 languages can upload any number of up to 12 hour videos that are then transcoded, captioned, and shareable FOR FREE is absurd.
If those videos never get more than 10 views, they're still hosted and accessible FOR FREE. If they blow up and get billions of views, the service scales to serve that FOR FREE.
If you get really popular, you can CHOOSE to show ads on your videos and get paid for views while still taking advantage of the free hosting and infrastructure. It's a goddamn miracle that individuals creating and publishing their own web videos has become a viable occupation. That didn't exist before YouTube.
Even more impressive is spending money and resources hosting all of the videos that will never be viewed. A user here posted a website that randomly played a continuous feed of never before viewed videos. It's astounding.
All this while also operating the second largest search engine in the world to help viewers find your content for free and being so reliable as to make global news when it goes offline for some users for an hour.
I can understand criticizing some of YouTube's policies or interactions with mega creators, but I legitimately can't comprehend calling their contribution "little actual user value".
 - http://astronaut.io/
 - https://www.cbsnews.com/news/google-cloud-outage-hits-youtub...
They've pushed everyone else that could compete out, because they've achieved such economies of scale and vertical integration that they can afford to build many products AT A LOSS, because only at their scale can one actually make money on the product.
A $50 Google Home speaker or Amazon Alexa should be $200. They're selling it to you basically at cost so they can get your data. Similarly, storing all those YouTube videos costs them money. This deal isn't free for the consumer; it's a tradeoff where you HAVE to use them, the company store.
 - Playback in the Miniplayer
 - Save to playlist and Save to Watch Later
 - Likes and dislikes on YouTube Music
I guess it's hard to do these in a way that is compliant with the privacy rules, or they're worried that kids will donate $1000 to their favorite YouTuber with Mom and Dad's credit card? It will be interesting to see how good their machine learning is at identifying "kids' content". From reading the FTC page, the delineation seems a bit arbitrary. I suppose we'll see if there's a financial impact on YouTubers that are borderline (is someone playing a video game considered children's content? What if the video game is rated M? Will the tiebreaker be statistics about their actual viewers?)
Though if that was the motivation I'd expect them to also disable auto-play of next video for content targeted at kids.
watch hours of YouTube recommended up-next videos!
I rather like the heavy-handed approach as a parent (who can be both lazy and diligent), as I think it reinforces good parental control of video content.
This will force lots of behavior changes for both consumers and producers. The key will be how they handle suggested videos, which may have unintended negative consequences, as I am sure the UX will still be based on generating maximum ad dollars, targeted or not, and not "healthy" video viewing.
Also, on mobile that would require you to exit the app and move to a browser. Figuring out how to pull up a video from the app in a browser requires a certain level of tech fluency. (You have to know to hit "share", "copy link", paste it into the browser, then somehow force the browser to not punt you back to the YouTube app.)
Guess we will see.
The introduction of a world where everything is 'fair' and 'nice' controlled by 'authorities' is something I notice often in those shows. The faces are big and stretched out from reality as a distraction from the theme.
Some of them encourage weird experimentation that can go wrong easily while pushing for odd stereotypes.
Good guy beats the bad guy with violence: punitive justice. A fair bit seems sexual, promoting eating disorders, creepy parental expectations, and misleading nutritional science.
I'm just curious, what do you find not to be family friendly about the show?
So that may not go over so well with, for example, a Jewish or agnostic family.
"Their aim was to produce children's videos which conveyed Christian moral themes and taught Biblical values and lessons."
The observable (and controversial) effect of this is an outcry by content creators, I suspect the measures they take around accounts identified as children are much less obvious and entirely orthogonal.
I think that's a bridge too far. Unfortunately, I don't think anyone would actually want a world where ad-targeting data could be used as a form of ID to prove your age...
If you want the data to be as good as an ID, then everyone would need to offer some kind of government-issued ID in order to create an online account, in order to make sure they weren't a child. Anything short of that, and you're going to have accounts that purport to be 21 when the 'real' user is only 12.
Or maybe you think they should use Machine Learning? So you then monitor every youtube video the 21 year old user watches, and determine that they are watching My Little Pony content over and over... so surely they must be a child. And then lock down the account and prevent them from making comments/purchases/etc?
The child protection laws are very strict. I don't think any of us would like a world where you get treated like a child (e.g. no right to view or participate on most of the internet) because of some algorithm. Civil suits against content creators seems like one of the only viable ways forward to me, as stupid as it is.
I guess if you are speaking of "what is best for society", ok, but that's a completely different thing. It sure seemed like you were debating the post above, but I guess not.
It was my own post and it had nothing to do with legal liability. As I said (clearly, I thought), I think Google is reluctant to show the world just how unreliable their demographic data is.
These changes are happening because they were tracking children (well, they were tracking everyone, which included children). They can't store any data from a child, none at all. That means they can't detect whether a user is a child, and even if they had a magical way to do it, storing an "isChildren" variable set to true would be illegal in itself.
Under COPPA, a video targeting children was treated as proof that they knew children were watching, and thus tracking them. So instead they moved the responsibility to the content creator to say whether the content is for kids or not (thus not storing data on the child, while still being able to not track the user, who is now assumed to be a child).
The only way they can store data about a child is with parental permission.
Targeting content aimed at children is... I'm not going to pretend it's good, or even long-term viable (malicious videos are still going to slip through). But it passes the extremely low bar of being better than age-verification gates.
Google figures it out the same way they infer any number of things about you and use it to resell advertising. If they can figure out someone is a woman between 19 and 25, I'm pretty sure they can figure out with a good degree of accuracy that a user is under 15.
There is pretty much 0 way that you could say that purchasing a universal communication device is carte blanche for a child to view any content online.
Not always. Minors are allowed to have cellular with parental consent in some cases.
Of course, the data is all there, but they just really promise they never put it together and knowingly went after kids.
Will this help?
To start with, I'm not a huge fan of the idea that the Internet as whole could be legally obligated to assume I'm a child unless I provide every site I visit with a drivers license, for example.
But let's take your assumption to the next level: let's assume that there's some privacy-respecting way of doing this (I don't believe there is, but we'll temporarily pretend it exists).
I'm also not a huge fan of the idea that the Internet as a whole could be legally obligated to assume I'm a child based on criteria that are outside of my control. I personally think most of our experience with filters online have taught us that they're best implemented close to the consumer, not close to the source.
To be clear, both Youtube and the FTC are turning this into a dumpster fire. But the idea of filtering content that's designed for children is better than the idea of filtering consumers. It's just that this implementation of that idea is garbage and unnecessarily hurts creators.
(What content/functionality is available to what user is a different question, and, I think we both agree, should be left for the provider to decide.)
(And I always find internet-as-whole arguments as a straw man. We are taking about YouTube and Kids. Nothing more.)
> Filtering? It’s a discussion about privacy
The point being, there's no way to determine whether someone is an adult without violating their privacy, and even if there was, the false positives would be (imo) an unacceptable tradeoff. If you want to have a version of COPPA that works without violating privacy, you have to focus on either labeling or filtering content that's uploaded to a service. Trying to filter users based on their age just doesn't work.
(There's an elephant in the room. All-of-internet wants its straw elephant out (I think). The lawyers just got confused.)
Even in that scenario, content filtering is more feasible and practical than detecting and banning children.
But, pushing that problem aside as well, sure. Assuming that you didn't have false positives, and assuming that you didn't have to violate anyone's privacy, and assuming that the solution wasn't obscenely expensive, that theoretical solution would be very helpful. It would basically solve the entire problem, for pretty much everyone.
But that theoretical solution doesn't exist.
If we're willing to make those assumptions, I would also like a special gun that only shoots bad people and doesn't work in schools or churches. And it would be nice to give law enforcement with warrants access to lawful information without requiring encryption backdoors. Of course, we'll also need to implement the new training regimen that gets rid of any corrupt prosecutors or officers.
I'll happily posit things for the sake of discussion, but there's a point where I'm suspending so much reality that the discussion just breaks down. I can't stop looking at the proposed solution and saying, "yeah but this thing you're talking about doesn't work." I guess I'm just not sure what you're getting at.
Asking for a friend.
Note that this scenario isn't entirely theoretical. It's rare, but there are already subsets of Youtube (trailers for horror movies are where I see this most often) that are locked behind a login screen because the uploader has indicated they're intended for mature audiences. You either log into Youtube (sacrificing some privacy) or you just don't get access to those videos.
This ends up being not a huge problem today, because the amount of content falling into that category is very narrow. Imagine a world where you fail Youtube's theoretical privacy-preserving test, and suddenly you can't watch anything, regardless of the content, that isn't designated as being specifically directed at children.
Of course, on some level that's just Youtube's choice. But now imagine if a government agency like the FTC dictates that every large Youtube-scale platform has to implement this test. Suddenly you find yourself not only unable to watch videos on Youtube, but also unable to make a Facebook/Twitter account. Imagine if you can't search for certain things on Google.
Note that this final scenario also isn't entirely theoretical with the FTC. My reading of the comments that the FTC commissioner has made is that the value of targeting Youtube is specifically that it's a centralized choke-point for this kind of content. I don't see any reason to assume that if creators migrate off of Youtube, that the FTC won't follow them and impose similar rules wherever they end up.
(I have a lot to say about content, including suggesting specific solutions to specific problems, but I’ve decided to not air this in a public forum. Further discussion is pay gated. Sponsor me on github for more !!! :))
If their demographic detection was bad, they would have no revenue and no customers since that's the flagship product that they are selling.
Content is likely the strongest indicator determining if someone is a child or not.