You also can't just whitelist things in the YouTube Kids app. There's straight up no option to affirmatively choose what your kids get to pick from.
So now YouTube is just straight forbidden unless I'm picking the videos, and that effectively means my kids don't see it at all outside of when I have a fun video to share... and even then I'm wary of what YouTube might suggest to watch next...
With YouTube it feels like no matter what you do, you're in a very sketchy neighborhood where there might be a good house here, but maybe some hellhole next door that YouTube is more than happy to send you to... It occurs to me that while I loved the promise of an internet that offered all aspects of humanity on a site-by-site basis, I don't think I want that, even for just me.
So now we're back to the PBS Kids Videos app as the only route for my kids to independently pick what they watch. I'm pretty much done with YouTube for now as far as my kids go, just due to the rabbit holes of terrible things on there, and the one time they take a shot at kid-friendly stuff, it really doesn't empower me as a parent.
I'm really enjoying Odd Squad.
I "whitelist" YouTube for my kids by using youtube-dl to scrape the few good channels for them (Lah Lah, Peppa Pig, etc). I pay for YouTube Premium already, and I only scrape from official channels, so I'm not pirating anything but rather directly accessing channels I paid to view, without having to go through YouTube's horrible interface which constantly (and purposely) pushes crap into my kids view that I never asked for.
Other than that, my kids aren't allowed on YouTube. They can watch scraped videos from our NAS or ABC Kids (Australian Gov-Curated kids TV).
Say, a whitelist for YouTube. In instances where YouTube's recommendations fall outside it, kids would just see a black thumbnail with no text, and won't click it. If they do, you can just block the playing of the video too. It would teach them skills for finding what they want online, etc, while you're at it.
I'm guessing this is modern ABC Kids too, because a decade or so ago, ABC Kids had some oddly dark stuff that's stuck with me for a while now...
Your way's probably sleeker, especially if it's hooked up to a NAS + XBMC or something of that sort, though :)
We're pretty strict about screen-based entertainment never being wasted time and always having some practical benefit, and ABC Kids has been great.
For our YouTube stuff I simply have a youtube-dl bash script I pass a URL or playlist to, and it scrapes it at highest quality MP4 then copies it to the /Kids/YouTube/ folder on the NAS, which the TV can natively play from. I'm working on a local-only web app so we can do it from our phones too.
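The commenter doesn't share their actual script; a minimal sketch of that kind of wrapper might look like the following, where the NAS destination path, the format selector, and the output template are all assumptions to adjust for your own setup:

```shell
#!/bin/sh
# Hypothetical youtube-dl wrapper: pass a video or playlist URL and it
# downloads best-quality MP4 into a per-uploader folder on the NAS.
# DEST and the youtube-dl options below are assumptions, not the
# original poster's script.

grab() {
    DEST="${DEST:-/mnt/nas/Kids/YouTube}"
    if [ -z "$1" ]; then
        # No URL given: print usage and fail.
        echo "usage: grab <video-or-playlist-url>" >&2
        return 1
    fi
    youtube-dl \
        --format 'bestvideo[ext=mp4]+bestaudio[ext=m4a]/mp4' \
        --output "$DEST/%(uploader)s/%(title)s.%(ext)s" \
        --ignore-errors \
        "$1"
}
```

Called as `grab https://www.youtube.com/playlist?list=...`, this pulls every video in the playlist; pointing it at an official channel's uploads playlist is how you'd "subscribe" without ever touching YouTube's recommendation UI.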
This can't work forever, as it effectively removes all the advertising; at some point they're going to start injecting the ads and suggestions directly into the video stream, so even tools like youtube-dl consume the garbage.
But it's a marked improvement over using YouTube as Alphabet intends, for now.
I am fine with Netflix as a reasonably decent filter as well.
There have to be other options...
I've been wondering about this for a while but I don't have any great ideas.
But as a parent, without whitelisting... not sure I trust them.
As far as Google goes, they were founded on search, and search doesn't ask you what you want. Google doesn't ask me what ads I want; they don't ask me what news I want. In fact, when they let me say "no news from this source" or "I'm not interested in this topic"... they still just sent me that news anyway, even ranking it over things I said I did want to see.
It seems clear at this point that AI, or whatever overcomplicated Python script everyone uses, will be pointed AT US, not for us.
I thought one of the big takeaways of the recent LinkedIn court case was, "if you're making it publicly available online, you can't use the law to distinguish between scrapers and regular people." And it doesn't seem like an aggregator would be breaking any copyright laws just by linking to content.
Use case #1. If YouTube only allowed people to upload videos (whether for a fee, or no charge) and share them with people they know, there would be no issue. You send videos to your friends.
Use case #2. If YouTube allowed advertisers to create videos and share those links via channels they control or purchase (e.g. paying a fee per view), there would be no issue. Companies you know market to you by video.
The self-created problem that YouTube has is that it wants to make money by charging to show category #2 to those in category #1 who didn't ask for it.
YouTube / Google could choose not to have this problem tomorrow by not selling ads, not recommending random videos to people, etc.
It's their obligation to figure out this problem, and totally up to them how to solve it. It's not some social or government problem to fix.
I think the problem with YouTube right now is their inability to respond nimbly to unwanted trends. As a human society, we've decided some things are so bad we're willing to ban their production and trade (human slavery, weapons of mass destruction). Some things are pretty much harmless (shoes, pizza). Lots of stuff falls in the middle (payday lending, pyramid schemes). Humans are exceptionally good at creating new stuff on each part of the spectrum, but YouTube seems unable to get a grip on the stuff near the bad end.
OP isn't saying that YouTube shouldn't exist, but that they chose to exist with that business model and so they need to solve the problems that come with it.
Leaving YouTube out of it, if you become extremely popular doing a thing and you have a million or ten million sets of eyeballs tuning in to watch you do that thing, you will become an advertiser (unless you somehow think it's immoral).
That's how consumer-based economies work. We don't buy things just because we need them or ask for them. Therefore, we are advertised to.
Section 230 enabled lots of internet businesses, but it also created lots of negative externalities that are only now being reckoned with after decades of a free-for-all.
Social media companies can and do solve this problem for countries like Germany (set your Twitter location to Germany, watch how many people disappear from it). They choose not to elsewhere.
There’s no human right that says these businesses have to exist and/or be as profitable as they are. Societies can and should decide that.
Careful here. It might lead to requiring Google to demand identification from users, KYC as in banking. That would be Orwellian creepy.
The Ricegum/Jake Paul gambling problem exists because there's just no easy way to determine and police the financial influence of a particular video. Not to mention the big money that YouTube makes by allowing covertly sponsored content by these huge YT stars to proliferate on the platform.
It's hard to see a world in which they get the incentives to generate content under control while driving the engagement and revenue they're making now.
It's their business model, so, no chance
Of course, there are also people who want to get seen without necessarily getting paid for it. Advertisers and propagandists.
(This is why "freedom of reach" matters; you might argue that YouTube shouldn't take down fascist and terror videos, but you can't argue that they're obliged to say it would be a good idea to watch them.)
Yes, let's. It arises from parents letting kids not only watch but outright discover content unsupervised.
If it weren't YouTube, it would be [insert name of threat to kids].
With both parents working, supervising kids is easier said than done – I'm not saying it's their fault for having to go to work to support their families, but the responsibility of policing their own kids is certainly theirs and nobody else's.
Please, tell me how I can whitelist channels and/or individual videos on YouTube for my kids. I'm not being sarcastic - I've done a lot of looking and can't seem to find out how.
There are settings for "keep it kid friendly" but that isn't enough. Some kid friendly videos are terrible for kids.
We've resorted to only letting the kids watch videos on a fire tablet, because we can control exactly what they can see on it.
Even then I wouldn't trust most channels; a channel could sell out and have elsa&spidey 'porn' on it tomorrow. As a parent I carefully curate YouTube and won't let the kids watch it on their own.
Netflix's "kids" channel is quite good IMO; some of the content - like Story Bots - I really quite like. Could do with more shows with people in, though; most of it's animation.
i agree the parent should pick out the content, but the whole auto-play-at-the-end-of-a-video situation makes it difficult: i'd make a good wager that (like when i was a kid) "TV time" is when the parents get things like laundry done, so they have to come back right at the end of a video to pick a new one. if it couldn't play without the parent's consent, the kid would at least yell out.
it’s not about parents being lazy (or hell even if it is, parents need a break too) it’s about creating a user experience that caters to the highly stressful life of being a parent. that’s what youtube kids should be!
At some point you need to hold corporations responsible for their negligence.
Oh, you sweet summer child. TV and film would love you to believe that, but today product placement is omnipresent. Michael Bay's Transformers movie is basically an ad for General Motors occasionally interrupted by robots. It's not enough to jam product placement into every inch of cinema being produced today; they now also go back and retroactively insert product placement into older productions.
- Knight Rider 2008 was basically a long-form Ford commercial, featuring not just the Mustang, but a Mustang that could transform into... any other model of Ford automobile. Ford commercials also played during the break.
- The infamous attempt by Microsoft to get the phrase "Bing it" to catch on, by paying shows like Hawaii Five-0 to use it in dialog during the show.
- My personal favorite example was Chuck, which dodged cancellation once or twice specifically because of the extent of its product placement partnership with Subway. It was so blatant that the showrunners made detailed descriptions of Subway sandwiches during the show literally a running gag.
Coca-Cola has already had a smack on the fingers for using "influencers" whose primary audience is kids on their channel.
This might hit Google directly if enough of this stuff happens. The end result I hope for is that videos featuring kids are forced private, so parents can't use them as props to make money. It's not like the children have any say when their life is being spread out on the net.
The US, which is Google's main base, has unfortunately not ratified the Convention on the Rights of the Child, but Article 16 of that treaty states that children have a right to privacy.
 https://treaties.un.org/doc/Treaties/1990/09/19900902%2003-1... page 37
Except this isn't true at all. For one, they run ads over the top of shows (which is super obnoxious), but at least they tend to be for other shows rather than products.
But even decades ago, shows like Transformers and Ninja Turtles were really just toy advertisements (literally, not being cynical). How do you distinguish that from this Youtube stuff?
The same would apply to let's say a documentary about TV advertising. Sure, it's going to contain ads in it, but it's ads you actually want to watch because they're part of the content you wanted to watch - a documentary about TV ads.
Before that, it was the same way in radio. I still have old recordings of The New Adventures of Sherlock Holmes, and it would open with a guy from Petri Wine or Blue Coal "meeting" Dr. Watson. I can still remember their patter: "Petri took time to bring you good wine." "Blue Coal, the finest anthracite money can buy." They'd have an ad break midway and Watson would chat with the company man about the merits of the brand, and maybe exhort people to buy War or Victory Bonds.
This isn’t new, advertising has had its hand up our collective bungs for longer than any of us have been alive. We should fight it all the harder now that for the first time we can block their poison without blocking content.
Whose idea was this? When did it start? I've seen this mostly on American TV shows for the past few years. It completely breaks the immersion, even on torrented content where the ads have been removed. (It's impossible to follow anything on regular TV with the multiple ad interruptions.)
It's like they really really want you to only watch shows on Bluray or streaming.
Star Wars, at least, began story first... and remained so until George Lucas realized just how much money could be made with merchandise and toys. Star Wars also predated all of the above cartoons so it didn't have that model to draw from initially.
When I first started using the internet, AIM was still all the rage. I'm not sure exactly how, but I got into the habit of joining random chatrooms and messaging random people on AIM. As a minor I was under way more threat there than I think kidfluencers on YouTube are these days (..so many 'asl?' comments). The major difference was that advertisers were not on AIM.
Eventually I got deeper into the rabbit hole and started to see the underbelly of the internet, things were much more of a maze back then. Even today, knowing all I know about how to find the worst of the worst online -- I mainly stick to huge platforms, anecdotally it's a lot harder to get caught up in the maze online than it used to be. Maybe that's a good thing, everything is out in the open for these platforms to mitigate.
> It seems as if you are never “hard core” enough for YouTube’s recommendation algorithm. It promotes, recommends and disseminates videos in a manner that appears to constantly up the stakes. Given its billion or so users, YouTube may be one of the most powerful radicalizing instruments of the 21st century.
Back when they were little, YouTube was great; my daughter taught herself to finger-spell off videos, and I'd happily credit it for her precocious reading. The worst of the algorithmic BS back then was the busty "reply girls" on every popular video. It's a lot scarier now.
Have you considered maybe going "old school" with the devices? Like, get them an old home computer (Apple, C64, TRS-80, etc), with all the trimmings, and let them play with that?
Or get one of the "newer" retro computing platforms (which you may have to assemble yourself) - such as this one:
...and let them learn with that?
Even an Arduino or RasPi could work; basically, give them a platform where if they want to do anything - they have to learn to do it themselves.
Maybe that's just my nostalgia and naivete being channelled. I grew up in the 1980s with a home computer plugged into my TV in my bedroom (and later got a modem and phone). There was something special about it; I wish today's kids could experience the fun and sense of discovery that made it that way.
I have, but I've largely dismissed the idea.
When I tinkered with an Apple II, it was top of the line - Oregon Trail was amaaaaaazing. I didn't have stuff from three decades into the future to compare it to.
My kids know powerful, easy to use computing devices exist, so building an animation in Logo or something isn't compelling. They use their iPhones and Xbox daily to do cool, tinkery things - my son builds complex circuits in Minecraft, for example.
I experienced the "back in my day" thing with my dad a bit - he wanted me to learn to take apart engines, and it took a while for him to realize I was doing the same with code. I think a lot of folks are doing that now with the next generation and modern devices - missing the point. Chances are they're doing something creative with it, even if it's not the same creative thing we were.
I agree just a smart phone is probably not worth a lot. Along with many devices, though, it can add up.
She is not allowed to use it during school, but it's in her bag for emergencies. I'm also a bit curious how long it takes until she asks for a "better" phone, but she also has a Nintendo Switch so the games part is covered there.
She is also not using it without supervision, the same goes for her laptop. My wife or I sit next to her.
When he started talking about what he saw in the videos as something he himself did, we decided to pull the plug. That was scary for me as a parent. Too much stimulation for a kid, and way too much exposure to things his brain isn't ready to handle.
Added this to my home PC's hosts file for now until I figure out a better solution. Facebook was just a "while I'm in there" addition in case he gets the urge to try it.
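The comment doesn't show the actual entries added; a typical hosts-file block of this kind looks something like the following (hypothetical - the exact domains and any extra subdomains are up to you):

```
# /etc/hosts (or C:\Windows\System32\drivers\etc\hosts on Windows)
# Route the sites to a non-routable address so they never load.
0.0.0.0  youtube.com
0.0.0.0  www.youtube.com
0.0.0.0  m.youtube.com
0.0.0.0  facebook.com
0.0.0.0  www.facebook.com
```

Worth noting this only affects that one machine and is trivially bypassed by any other device or DNS setting, which is presumably why it's a stopgap "until I figure out a better solution."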
This has been the only way I have found to limit viewing for kids to specific videos.
Can we please have comments more substantiated than "Ban them all"? I remember when HN actually had interesting discussion, and wasn't just a peanut gallery. Comments like these are incredibly uninteresting, and only contribute to HN's increasing reputation as a reddit-esque echo chamber.
@dang I'm hoping you can weigh in here, I'd really like to know what the mod team thinks about these kinds of comments (if they're worthwhile to have on the site), and I'm sure you're a more authoritative source on if they're becoming more common or not.
I don't want to pick on floren personally, though. That account has a fine posting history and seems as concerned about comment quality as you are (e.g. https://news.ycombinator.com/item?id=19039487). The problem is rather that most bad-for-HN comments are unintentional, at least once you scrape out the bottom of the barrel.
As for HN's reputation vis-à-vis Reddit, I think that's been pretty stable for years now.
I can't make a comment asking every single fluff comment to please elaborate a bit more just so people go "oh, huh, yeah this shouldn't be here". And you can't deny that in the last N years, HN has become more and more happy to upvote fluff.
1. The idea of a split between platform provider (ie hosting) and discovery (ie promotion) is an important one
2. Google's PageRank assigns "juice" to a trusted domain - but YouTube and Facebook break that concept, and so Google (or all search engines) needs clearer ways to define "publisher responsible for content that gets some recommendation juice"
2.a. By splitting the hosting and recommendation we can start to see different curation approaches - this dark maze of YouTube recommendations (all I get is more Marvel) could be repackaged with different curation algorithms - and the more data different AIs have to share, the more clearly the dark/light patches can be seen by researchers.
3. Paid ad disclosure is a simple one to solve - we do it on TV all the time. Quirky home videos are great - but once you have ten million followers you are a business and can afford the regulation
4. Yes paedophiles do spend a lot of time watching kids videos online. We can use unusual watch time patterns to spot this, and that's good - but really paedophilia has been a problem for 10,000 years and we as society need to find wider bigger ways to tackle this - along with medicine this will be a huge win for social ROI
5. In short, regulation is coming to all these platforms - but agreeing international rules for the same platforms is going to involve amazing new levels of international co-operation and questions of sovereignty - let's try not to Brexit the lot up.
What is the law that is being broken here? Why is it a problem that someone (pedophile or not) is watching a lot of legal content (that happens to feature kids)?
But the main point I am making is that these platforms all face significant pressure to "tackle paedophilia" - and they do offer a novel approach: there was a reference recently in Wired to tracking the time spent watching family videos by people wildly outside the family network (usually a video of a kid dancing in a tutu is seen by 30-50 people; when views hit 10,000 there is something to investigate).
But in the end "tackling paedophilia" is not something we can offload to the AI in youtube - it's a massive social cost and so a massive social investment.
I am unreasonably disappointed about not seeing actual infancy metaphors in your comment.