Nestle, Disney Pull YouTube Ads, Joining Furor Over Child Videos (bloomberg.com)
255 points by nopriorarrests 64 days ago | 300 comments



>On Sunday, Matt Watson, a video blogger, posted a 20-minute clip detailing how comments on YouTube were used to identify certain videos in which young girls were in activities that could be construed as sexually suggestive, such as posing in front of a mirror and doing gymnastics.

I looked at some of the videos that appeared in his report and it's basically family videos of young girls, the same kind of videos my own sisters would shoot with our dad's camcorder. His first example is the set of videos that come up when you search https://www.youtube.com/results?search_query=orbeez+bath.

I'm sure these videos do attract some weirdos in the comments, and in the next paragraph Youtube says that they want to moderate that activity, but what else do you do here? Ban kids uploading a video of their innocent pool party because some creeps might enjoy it?

The youtuber's video includes the provocative phrase "Sexual Exploitation of Children", but I only saw harmless videos of kids having fun when I looked at his own examples. He also focuses a lot on the comments themselves, which is a very different argument.


> it's basically family videos of young girls, the same kind of videos my own sisters would shoot with our dad's camcorder.

Your family camcorder videos probably weren't made available for checkout at the public library, filed under "young girls doing gymnastics" in the library's card catalog. If they were, I imagine your family and many members of the public would be creeped out or upset.

> but what else do you do here?

Require (or maybe automatically force, if implemented well) those videos to be set to Visibility: Unlisted. I don't think anyone would be upset with sharing these links with family.


Not sure that analogy makes sense. YouTube makes it pretty clear you are publishing your video to the public when you upload the file. You actually have to confirm the video is going to be public. It's not like YouTube is raiding the family archives for the content.


> YouTube makes it pretty clear you are publishing your video to the public when you upload the file

After working in tech for long enough, you realize that no one reads anything, even if it's absolutely clear in the UI. And they will blame you for it.


Yes, indeed! Myth #1: People read on the web: https://uxmyths.com/post/647473628/myth-people-read-on-the-w...

I always have a hard time explaining to people who carefully write walls of text that throwing out 90% of it would actually be more effective...


As well they should, because (the hypothetical) you built and deployed something that relied for safety on users doing something you knew they wouldn't do.


Non-rhetorical question: how would you build this feature? What if the user lies about their age to avoid COPPA? What if the user uploaded 100 videos in the past, but didn't want to make this video public? Should we bug them every single time to verify if they wanted to make the video public? How many times should we bug them about it?


So in your world, we should just never give users nice things because we know they're stupid?


No, you make the default safe and make the "dangerous" option hidden behind an advanced menu, so at the very least you can be certain the user can read and follow directions. Like a two-stage weapon switch.
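
To make that concrete, here's a minimal sketch of a two-stage default in Python (the names and API are hypothetical, not YouTube's actual upload flow):

    from enum import Enum

    class Visibility(Enum):
        PRIVATE = "private"
        UNLISTED = "unlisted"
        PUBLIC = "public"

    def effective_visibility(requested: Visibility, confirmed_public: bool) -> Visibility:
        # Stage one: the user picks PUBLIC from the advanced menu.
        # Stage two: a separate, explicit confirmation. Without both,
        # the upload falls back to the safe default, UNLISTED.
        if requested is Visibility.PUBLIC and confirmed_public:
            return Visibility.PUBLIC
        if requested is Visibility.PRIVATE:
            return Visibility.PRIVATE
        return Visibility.UNLISTED

    # Picking "public" without the second confirmation stays unlisted:
    assert effective_visibility(Visibility.PUBLIC, False) is Visibility.UNLISTED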


What if not reading those things is smart, since almost all of it is crap? Then you are banking on them being reasonable, but the stats are still in your favor.


The tools you give them should come preconfigured to stop them hurting themselves, because you already know they will if you don't. It's not hard to go from that knowledge to questioning what it means if the defaults are harmful.


This. Public videos shouldn't be the default. But I assume this change would bring YouTube a lot fewer eyeballs and hence less revenue.

In the end it boils down to the truth that a corporation is a hive mind, an intelligent machine ruthlessly optimizing for stock price and profits, sometimes at the expense of human values.


The whole point of YouTube from its inception was to share videos with the internet. This isn't some shady hivemind hatching an evil scheme.


Things are not clear just because they have been spelled out in text. I don’t think this has anything to do with tech, except that tech tends to be full of abstract concepts that are difficult to understand.


It's strange to assume that people think youtube is not a video sharing service.


What people see and viscerally understand is that they 1) upload a video and 2) get a link that they can share with others. Up to this point, it's the same as e.g. Dropbox.

What's not immediately clear is that the video (and channel) now shows up in search results. You won't be aware of that unless you either have a clear mental model of the platform, or try to search for the video title yourself (which implies a certain sort of mental model to be present already).


Upon opening YouTube the visitor is faced with videos published by others. Upon uploading there's an input dropdown front and center, by default saying Public.

Sure, people only skim and headline-read articles, but that's because most articles are shit. That doesn't mean people don't ponder at least a bit when faced with a YT upload page. (And upload through the mobile app is probably more common for non-pro [first-time, casual, accidental(!?)] youtubers; there the privacy: public field is pretty visible, legible and explicit.)

"Public" can mean anything in people's minds; to some it means "anyone I send the link to can see it." But it's not exactly YT's fault that people don't make the connection and somehow live in a fantasy world.


> no one reads anything [...] they will blame you for it.

And I cannot blame them too much. People are constantly bombarded with misleading messages (e.g. ads), clickbait stuff, terrible UIs, indecipherable contracts, countless pop-up notifications and warnings, opt-outs and other "dark patterns".

(disclaimer: personally, I read contracts and agreements and it's often pointless and frustrating)


Part of the problem is the publishers. They know what they are uploading and want it to be public. You are taking the analogy too far.


I don't think you can safely assert that people know that they "want it to be public". The last decade or so has demonstrated pretty adequately that people's brains have a lot of trouble with what "public-to-the-internet" means.


Sharing information with the broad public is the whole point of social networks. People might not want all the consequences, but they want some consequences.


No, the point of most social networks, to most people, is sharing information with people close to you. Witness Facebook being more popular than Twitter.


I know. I just think it’s something YouTube should implement, because people uploading videos of their family probably don’t intend on complete strangers searching for them and thus probably don’t worry about making them public.


> you are publishing your video to the public when you upload the file

Due to various cognitive biases most people fail to realize just how big and far reaching that public is.

To share an example from my own experience a few years ago: I wrote a blog post. I love writing blog posts and sharing them with The Public. That's why I write them.

In one of them I used an XKCD comic. My entire audience, anyone I could ever imagine reading it, knows XKCD and loves it.

But then that blog got 100,000 views. Then 200,000. Then it spiraled. Suddenly I was flamed from all sides, called a disgusting human being and all sorts of things. Why? Because I didn't correctly attribute that XKCD comic.

"Yay my friends and their friends and liek 1000 people saw my thing" is a completely different "public" than when THE public sees your thing. Most people imagine the former when they think "Oh I'm uploading this and it's going to be public"


But in this example, would you blame WordPress or whatever is hosting your blog?


I learned the hard way that YouTube is not useful for sharing family videos with a limited audience (i.e. immediate family; 10 people, tops).

I uploaded my children's dance recital video, as an unlisted video, and earned myself a copyright strike because of the background music.


Copyright strikes are automatic, and you can dispute ones like that as fair use.


Since when is it fair use to record and publicly publish a performance that includes copyrighted music licensed for public performance as part of a live event, but specifically and explicitly not for rerecording and distribution (as would normally be the case for the type of recital described, assuming the dance studio was on the ball with licensing)?


Just to take this a little step further:

1) Purpose: to illustrate the efforts of my young children in a non-professional context, not to showcase the audio portion of the recording.

2) Nature: audio + video, with explicit permission to record from the show executive/directors (it was a rehearsal, professionals recorded the actual show) and focus on the visual and physical nature of the unique performance

3) amount: abridged songs, amateurishly cut and remixed

4) effect on potential market: There is no way this is a substitute for any of the original copyrighted audio, and I communicate this intent to youtube by requesting it remain unlisted.

It probably does pass the four-factor fair use test, but fighting with the YouTube Content ID machine is futile, and their system doesn't care about fair use.


It absolutely is fair use. Let's consider:

- the purpose and character of the use, including whether such use is of a commercial nature or is for nonprofit educational purposes;

Which is non-commercial.

- the effect of the use upon the potential market for or value of the copyrighted work.

Which is next to none.

There's also a question of whether a video containing a bad transfer of a piece of music along with children dancing is "transformative." See Lenz v. Universal Music Corp., where Universal Music lost in the courts for ignoring fair use when taking down a video of a dancing baby.

Your whole argument seems to be based on some fictional contractual agreement between the dance studio and the music's owners. That isn't relevant to the discussion at all; a parent recorded and distributed the video, not the dance studio for commercial purposes.


While I take your point, the very act of uploading the video is akin to "going down to the library with a copy of the tape for anyone to rent" in this analogy, which does mitigate the degree of upset to an extent.


Mixing some metaphors here, but I've encountered a lot of people who view it like going to get your photos developed to share with family only, then finding that other people can order copies too.


I mean, there are probably people who buy DVDs of gymnastics events to get their jollies. Even in the VHS days, there were pros who would video competitions for parents and kids to buy if they wanted. It's not a new idea; the medium is just new.


Ok, but we have new tools and they can be used to counter this unproductive negative trend. It’s not as if Google isn’t a leader in ML and whatnot.


Those same parents might take a different view of the pro assembling an edited lowlights tape.

Which seems very close to crosslinking comments with timestamps on fully public videos, with advertising.


You're assuming that these people don't want their videos public. Even if it's true for some, clearly some of these uploaders do want their videos public, so your solution doesn't generally solve the problem.


> Require (or maybe automatically force, if implemented well) those videos to be set to Visiblity: Unlisted. I don't think anyone would be upset with sharing these links with family.

But, I mean, this takes away the use case of YouTube as a replacement for America's Funniest Home Videos.

Not that that's a necessary thing for society, but it is something to think about. The chances that some creeps found AFV stimulating are non-zero, and advertisers paid that show to have their ads. The only difference here is the openness of the comments, whereas those AFV creeps probably had to communicate in secret.

The whole thing is creepy and awful, but I'm not sure what the best solution is. Perhaps banning comments on those videos.


> but what else do you do here?

Not attempt to monetize your family videos. Disable ads on your own videos.[1]

[1] http://orgspring.com/turn-off-ads-youtube-videos/


I don't think a lot of parents are doing that. It's more a matter of people ripping (sometimes unlisted) family videos and posting them on accounts with large collections.

The videos themselves aren't really the issue. It's the context they're presented in.


That solves the issue from the advertisers' perspective.

It doesn't solve the "pedos are looking at my kid" side of things.


Don’t use YouTube. Photos on Mac and iOS make sharing family videos dead simple. Not sure why anyone uses YouTube to share videos they don’t want to be public.


Nobody in my family has a Mac or an iOS device.

It's possible to have family videos private on YouTube and only share them with a small circle of people.


> Photos on Mac and iOS make sharing family videos dead simple

That’s categorically false. Every iCloud user has their own photo library and you cannot create a collaborative one. So all my photos are in mine, all my wife’s are in hers, and we’ve resorted to uploading to a joint (also paid) Google Photos account for sharing with each other and family.


Shared Albums exist and they are collaborative. https://support.apple.com/kb/PH13690?viewlocale=en_GB&locale...

There's a quality downgrade, so it's not suitable for sharing archive-quality photos, but for us it's adequately replaced Facebook for sharing photos and videos of our kid, vacations, etc.


However a shared album requires one to manually, selectively choose photos to share with people, as opposed to sharing all your photos with your spouse, which Google makes easy.


I thought the context was in contrast to sharing as unlisted videos on YouTube (the comment you were replying to)


Mac and iOS are prohibitively expensive for a large fraction of users.


I think the discussion is important, because something needs to be done. There's a scary situation going on.

Yes, the videos are kids being themselves, and typically uploaded by the children. However, then these predators show up pretending to be other children, and they talk these kids into doing more explicit and inappropriate things. They ask children to film a video sucking on a lollipop, they ask them to make a video showing how to do the splits, or they ask them to make a video playing twister with their sister. I even saw them asking children to do the "toothpaste challenge", where they try to get them to fill their mouth with toothpaste so it looks like cum. If I start typing "toothpaste challenge" YouTube autocompletes "toothpaste challenge drool", "toothpaste challenge tongue", "toothpaste challenge little girl", "toothpaste challenge girls".

These kids don't know any better. They make 50 videos that all have 50 views, then they make a yoga video and it gets recommended over and over again to predators, and they end up with one million views. Then the above situation unfolds where they're requested to make more inappropriate content, or the children realize their gymnastics videos are by far the most popular, so they make more, and more. They think they're making tutorials for other children, but the other side of the camera is just thousands of predators watching and pushing them to go further. Then they start trying to trade contact information with the children to get them off the site.

I don't know how you fix the problem, but it is a problem. We're talking about videos with millions of views here. It's not one or two people trying to exploit these children, it's literally hundreds of thousands, and I think the kids need some help protecting themselves.


> However, then these predators show up pretending to be other children, and they talk these kids into doing more explicit and inappropriate things.

So instead of trying to ban the videos, restricting user freedom to prevent them, and deploying automation to recognize them, why don't we ban the predatory behavior, set system rules to prevent it, and/or direct AI at identifying it?


Blocking comments only solves the direct contact between predators and children.

As I said, some of these children will make 50 videos with no viewers, but then they create a gymnastics or yoga video that goes viral with predators.

Even if the comments are disabled, these children realize what subject matter grows their subscribers and view count the fastest, so they keep producing more of those videos. Then another kid sees their channel, and notices their yoga video got one million views, so now they try making yoga videos to replicate it. To their surprise, they also gain traction, and the cycle continues.

So, even without a single comment, you still end up with thousands of children making tutorials on how to do the splits, while they cannot begin to comprehend what is taking place and how they're being exploited.


> Blocking comments only solves the direct contact between predators and children.

I'm pretty sure the phrase I used was “predatory behavior”, not “comments”, so I'm not sure what your point is.


Sorry, I assumed you were referring to comments. If not, what predatory behavior do you want them to identify and ban?


Comments are the only visible indication of the predatory behaviour, therefore they are the only input any moderation process can use.


> Comments are the only visible indication of the predatory behaviour

Strange, because the post describing the limitations of blocking comments identified other visible manifestations of predatory behavior.

Admittedly, they may be ones that are difficult to distinguish in a single act from acceptable behavior, but there is no reason an automatic detection system needs to consider each action independently in isolation.


serious question: are there enough predators to influence the algorithm? what you're describing from the "yoga incident" sounds like an anecdote?

edit: holy shit, i just watched the video report. YT is in serious trouble.


I'll tell you why, and it's going to be unpopular: Money. Millions of views === money, no matter WHAT or WHY it's being viewed ( remember that many of these videos are monetized ). Do you seriously think Google encourages their developers to develop systems which will reduce viewing?... The only way I see this happening, is under duress, i.e. profits suffer because of bad publicity, or similar social pressure.


As much as I'm not a fan of "big govt" in many cases, I hope there is legislation that prohibits parents from sharing photos, videos, etc. of their kids publicly (e.g. a public IG account, a public YT channel) without consent (e.g. meaning the child would have to be of an age where they can even legally give consent).

Children have the right to privacy, and it is crazy how many parents do not think twice before sharing.


I feel sorry for kids these days, thinking about how videos and pictures (and plenty of embarrassing ones at that) are plastered all over social media by their parents, with them having no ability to ever exercise any control over the sharing of those videos and pictures once they are older and potentially bothered/annoyed by it all.

That being said, it's hard to really regulate this. When someone has a baby, they usually share a photo in a public manner of the entire family. It's hard to argue that parents shouldn't be able to share any pictures of their family in a public manner. It also would then prevent journalists or anyone for that matter, from taking pictures in public spaces that contained any minors... which starts to get absurd.

Hopefully my kids will appreciate that we never posted pictures of them publicly on social media (only shared privately within immediate family). At the very least, it's one less thing they'll have to worry about as they grow up.


Ironically the first time I’ve seen this point made is in an Onion video - 6-Year-Old Explains How Messed Up It Is That Her Entire Life Has Been Put On Facebook https://www.theonion.com/6-year-old-explains-how-messed-up-i...


>Hopefully my kids will appreciate that we never posted pictures of them publicly on social media (only shared privately within immediate family). At the very least, it's one less thing they'll have to worry about as they grow up.

The trick is to live like it's 1980. We have many, many pictures of our child, but only 10-15 on social media (of things like winning sporting events or the like). Everything else is in digital form on drives in our home server or printed in physical frames. If/when the child decides they want to do something with their photos, there they are.

Also handy to have on the home server because we can turn any screen into a digital frame on our network. That will be great when they bring their first boy/girlfriend home.

People just feel the impulsive need to share literally everything on social media. Everything is a competition for likes and attention. It's disgusting.


Yup. We still print out an annual photobook that we spend time making with all our favorite pics/memories from the year and the kids love going through the picture books every so often. There is something very different in the way they experience a physical picture book vs. viewing the images on a tablet/computer.


As a parent you should be able to share to a private FB, a private IG, a private YT account. That's fine, because you presumably have control over who sees those photos (hopefully people you actually know).

I'm also not talking about someone sharing a photo publicly with someone else's kid in the background.

I just mean as a parent you should need consent to share a video of your kids to a public social media channel. If your kid cannot give consent because they are underage or refuses, then you shouldn't be able to share it.


> As a parent you should be able to share to a private FB, a private IG, a private YT account. That's fine, because you presumably have control over who sees those photos (hopefully people you actually know).

I'm afraid not a lot of parents/people would be willing to pay for such a service, so I'm unsure this is going to happen any time soon.


Many of these videos are uploaded by the kid themselves, no? Above 13, you're allowed to have your own account and upload videos. Even if the parent uploaded videos are removed, it still doesn't fully solve the problem.


Many governments have made clear that children are unable to consent on their own to many things (the "Age of Consent" itself comes to mind), and before the age at which they are able to consent, their parents act as a steward of their consent. It does bring up an interesting line of thought though, which is that if we legislated that children (once adults) can consent to the release of their likeness/recordings of themselves, and they DO consent to sexually suggestive content as described in this post, can they also not consent to release fully nude/sexual content of themselves from when they were younger? Where along that line do they lose the ability to make informed consent decisions? I would also note that children also do not seem to have an expected right to privacy from their own parents, per cultural norms.


We have not reached consensus on the subject of whether bad images of children are bad because the child cannot have consented to the production/spreading of the image, or simply bad inherently in and of themselves - even within most jurisdictions, never mind across them; however, consensus is currently swaying towards the latter - more and more jurisdictions are leaning towards outlawing e.g. drawings of imaginary children.

In practice no politician will want to be on record as the cause of child safety ratchets being relaxed; and so there is no realistic path from the current environment to one in which a person may become free to act as they see fit with previously problematic images of themselves on reaching majority - the trend is towards such images being seen as problematic purely based on content, regardless of context.


Wouldn't that also ban child actors? Also, a lot of famous YouTubers like Mark Rober show their kids or nephews playing around with their latest experiment. Most people would consider banning these videos reactionary.


I think you could apply for exceptions like you would for child actors (e.g. whatever law allows them to get around child labor laws)

Point is there should be a way to limit public sharing of photos and videos without a child's consent. There should be a good reason, beyond a parent's own narcissism.


> I think you could apply for exceptions like you would for child actors (e.g. whatever law allows them to get around child labor laws)

There is an application in CA for an entertainment work permit, but that's mostly validating age and satisfactory education status (for school age children) and medical clearance (for infants under 1). So, I think the analogy doesn't work and you should work out what kind of application you mean here.


As disturbing as I find this situation, I think it is a bad idea to force companies to block photos and videos of children from being uploaded.


"Ban kids uploading a video of their innocent pool party because some creeps might enjoy it?"

Given everything else YouTube already does, I seriously, no sarcasm, don't see this as a particularly problematic solution. As ceejayoz pointed out, there's some thorny consent issues involved here: https://news.ycombinator.com/item?id=19209766 (ceejayoz' point isn't much larger than my summary of it, but I wanted to give credit. I think it's a good point.)

Sometimes scale brings things like this. I have to admit giving a smaller, specialty service a niche to exist in doesn't exactly bother me, either.


I upload unlisted videos to share with family. I batch and upload them on a monthly basis. Never had an issue with Youtube and my family videos. Last month one video was my daughter fully clothed in the bathroom dumping a cup of water on my son in the bathtub and both laughing. You could only see his head and chest. It was 10 seconds long and funny play between young siblings.

It was instantly and automatically flagged, removed, and a strike placed against me for going against community guidelines. I had 40 characters to appeal, but it is clearly automated, since the response was instantaneous. There are other videos of kids dumping water on each other, even when both are in the tub. An email to YouTube support asking what guideline I violated has gone unanswered.

To me, if Youtube is automatically, instantly flagging my video, there is no reason they cannot automatically flag and remove the videos in Matt Watson's video, especially if they are public and are attracting a wide range of comments.

Don't get me wrong, it's Youtube's pool, so their rules. They can choose the content they want.


There seems to be something wrong with this argument. You're displeased that their algorithm is too sensitive and flagged your content as a false positive, right?

But then you think they can apply the algorithm more broadly? Wouldn't that just risk many more false positives and irate users?


>You're displeased that their algorithm is too sensitive and flagged your content as a false positive, right?

No. As I said, their rules. I have no idea if it is a false positive since I cannot get an answer as to what guideline was violated. I could have violated one of their guidelines.

Youtube doesn't owe me anything ... I use their service for free with unlisted videos to share with family. They make no money off the videos.

My point was if they can flag something like my video, I am sure they can flag the ones given as examples attracting the other people.

I can understand your point that false positives would generate more upset content creators who do make their videos public and would like ads run against them.


You point out that there are similar videos in the system that aren't getting flagged, which makes a pretty good argument that the algorithm is random and ineffective.


No. He’s arguing that if the algorithm is capable of detecting his video, it is capable of finding others like it - so if google wanted to remove such videos categorically, it can and currently chooses not to.


They sure can. They removed a video of mine for copyright because of background music in the environment where I was shooting.


Which is a completely different, and rather trivial, problem compared to "automatically detect the age of people in a video and/or if they are doing something sexual".

The first is a mere process of comparison, while the latter requires quite a few levels of subjective abstraction.


> copyright because of background music

I feel you. Many of my videos have been flagged for copyright violations due to background music, but since they are unlisted and only for sharing among family, I'm ok with it.

There have been a couple of times they have muted a whole video for a snippet of music, but I've never had a video removed for music.


The problem is that your videos are unlisted, so Google isn't making enough money from them.

If you posted them publicly so Google could properly monetize them, they'd be "Featured videos of the month!" in a pedo group.


> Given everything else YouTube already does, I seriously, no sarcasm, don't see this as a particularly problematic solution.

Isn't this punishing the users more than punishing YouTube? I get how these kind of videos would be banned in a country like Saudi Arabia, but in the USA, or much of Europe? Yes, CP should be disallowed, but kids playing in a pool adequately covered?


"Punishing" is kind of a loaded term though, isn't it? "Not hosting your video for free because of concerns about risk to YouTube" isn't really a "punishment".

As I sort of alluded to already, giving other services an incentive to exist, or incentivizating more self-hosting, I see as a virtue anyhow. YouTube shouldn't be The Video Site On The Internet.


It really is, if you could post videos for free before, but now some people in parliament have made it so you can’t anymore.

Other services may or may not fill the huge created gap, but what’s to say they won’t be subject to the same regulations as YouTube? Anyways, the politicians would probably take a huge fall if they did this, voters don’t often buy “its for your own good” arguments.


Several videos appeared to be re-uploads by people other than the original uploader.

I don't think there's an ideal solution. But they'd need some mix of detecting videos that would be prone to these kinds of comments, then subjecting the comments on those videos to higher scrutiny, such as stricter automatic detection/deletion (EDIT: shadowbanning or hiding the comment as a "first pass" seems good) or even manual review.

Manually checking everything is impossible without severely hampering the freedom and capabilities of the platform, and automatically detecting everything is not possible without accepting a heck of a lot of false positives.

But putting more effort into a mix of the two would be better than what they appear to have been doing so far.
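
A rough sketch of that mix, with invented thresholds and a placeholder classifier, just to show the shape of a two-tier pipeline (nothing here reflects YouTube's actual systems):

    def looks_suspicious(comment: str) -> bool:
        # Placeholder heuristic; a real system would use a trained
        # classifier plus commenter history, not a keyword check.
        return "why you're here" in comment.lower()

    def moderate(comments, video_risk_score: float):
        # Tier 1: a cheap video-level risk score gates everything else.
        if video_risk_score < 0.3:
            return "normal comment policy", []
        # Tier 2: hide (shadowban) suspicious comments as a first pass,
        # queueing them for manual review instead of deleting outright.
        if video_risk_score < 0.8:
            hidden = [c for c in comments if looks_suspicious(c)]
            return "suspicious comments hidden pending review", hidden
        # Tier 3: worst cases get comments disabled and human eyes.
        return "comments disabled, video queued for manual review", comments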


> Several videos appeared to be re-uploads by people other than the original uploader.

This is very significant. It becomes 1000X more sinister here - it goes from "innocently upload family videos to share" into "steal family videos that are popular with perverts, post them yourself and monetize them".

It will be very difficult for YouTube to separate legitimately uploaded family vids that attract perverts from scum who use stolen "innocent" content to pander to perverts.


For one, you could see if there's more than one family being uploaded. If one YouTube channel has the same 6 children in it across 10 videos, it's probably okay.

If one channel has a different set of kids every video, maybe not okay.
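
If you squint, that heuristic is implementable with off-the-shelf face embeddings. A sketch, where `video_faces` holds unit-normalized embedding vectors from some hypothetical face model (everything here is illustrative):

    from itertools import combinations

    def recurring_cast_score(video_faces, sim_threshold=0.8):
        # video_faces: one list of embedding vectors per video on the channel.
        # Returns the fraction of video pairs sharing at least one person.
        # High score -> the same kids recur (likely a family channel); low
        # score across many videos -> different kids every time, escalate.
        def same_person(a, b):
            return sum(x * y for x, y in zip(a, b)) > sim_threshold
        pairs = list(combinations(video_faces, 2))
        if not pairs:
            return 1.0
        shared = sum(
            any(same_person(a, b) for a in fa for b in fb)
            for fa, fb in pairs
        )
        return shared / len(pairs)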


Youtube could leverage their existing ContentID system so reuploads of videos that are private are taken down and the accounts locked after three strikes.
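
Content ID's internals are proprietary, but the matching idea is easy to sketch. A toy version using exact frame hashes (real fingerprints are perceptual, so they survive re-encoding and cropping; this stand-in only catches byte-identical reuploads):

    import hashlib

    def fingerprint(frames, step=30):
        # frames: raw frame bytes; sample one per `step` to keep it cheap.
        return {hashlib.sha256(f).hexdigest() for f in frames[::step]}

    def is_reupload(candidate_frames, protected_index):
        # protected_index: union of fingerprints of videos whose owners
        # marked them private/unlisted and ineligible for reupload.
        return bool(fingerprint(candidate_frames) & protected_index)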


> Several videos appeared to be re-uploads by people other than the original uploader.

Preventing reuploads of videos that aren't explicitly flagged as eligible for that ought to be a trivial application of tools they already have for copyright enforcement, so that aspect of the problem should be trivial to solve.


YouTube doesn’t have a hard time blocking feature films.


Monetizing mundane home videos with ads is a creepy and bizarre concept to begin with.


Child pageantry is a multi-million dollar business, and has been for decades.

Everyone is part of the problem, starting from the parents who push their kids into the public spotlight, to the local media that provides coverage of this stuff, encouraging other fame-seeking parents to try to get in on it.


Worse is making children work so they can earn money as influencers. Don't see G lifting a finger there.


It's literally what TV shows like AFV did.


Notice I used the qualifier "mundane." Funny, potentially meme-worthy videos are one thing; unexceptional videos of kids' talent shows or kids just doing everyday things should not be advertised against. Ideally they should be unlisted or posted to Facebook or some other social network that isn't broadcast to the entirety of the web.


> but what else do you do here?

For your example ("the same kind of videos my own sisters would shoot with our dad's camcorder") there are two remedies:

1) Upload the video as unlisted, so it will not appear in recommendation systems and you can send the link to your friends/family.

2) Disable comments on the video.

Public content is a bit more dangerous to leave unchecked in 2019 than in 2009.


Sure, but that's an uploader's decision they can already make today. It didn't prevent this drama.

The suggestion here is that Youtube should do something about it, and I'm asking what that solution specifically looks like.


It also seems like people are downloading these videos from their creators' channels and then reposting them under their own accounts.


Better defaults


Cooperate with the authorities to find and investigate people who post creepy comments? You already can't post bomb threats online without getting an investigation.


Bomb threats are illegal. Creepy comments are not.

This may be an awkward position to take in public, but I'm not comfortable with the idea of criminalizing "creepy comments", especially on the internet.


I once saw police notified because an employee was looking at gunbroker.com on a company computer. Looking at listings of guns for sale is not illegal, but it bothered the company enough that they called the police and a few questions were asked.


Comments that cross the line into "grooming" are definitely on the continuum of child abuse, which is illegal.


> Public content is a bit more dangerous to leave unchecked in 2019 than in 2009.

What sort of advances have happened since 2009 that would explain this?


The sad but inevitable movement from the open web to a balkanised web. Much of human interaction, both good and bad, has moved into closed Facebook groups, Discord, WeChat and Telegram channels beyond the reach of the search engines. YouTube just happens to exist in the present twilight zone between the old internet and the new internet.

To me, the seminal event marking the transition has to be IMDB shutting its forums. Public discussion and discourse is now seen more as a liability than something of value.


I have looked into this before and one thing I noticed is that the videos with the creepiest comments tend to be reposts. The videos that actually belong to the kids and families who posted them often have their comments disabled. Perhaps YouTube could do more to delete the reposted videos attracting so much perverted attention.


I think this is a really good idea!


It reminds me of Kurt Cobain's assessment of those who saw child pornography on the cover of Nevermind: if you see this as sexually suggestive, then you are the pedophile.


>I'm sure these videos do attract some weirdos in the comments, and in the next paragraph Youtube says that they want to moderate that activity, but what else do you do here? Ban kids uploading a video of their innocent pool party because some creeps might enjoy it? What about topless babies?

What about the rights of the children? Maybe they should have the right to wait and decide, upon adulthood, whether they want to share these videos rather than have their parents blithely allow weirdos to leer at and comment on them.


Why just children though? Why not everyone? If someone uploads an embarrassing video of me without my consent, I don’t see why it would matter whether I’m a child or an adult...


Well then you also need to make sure you don't take them outside. Especially not to a swimming pool, or let them participate in gymnastics. There could be creeps present, thinking bad thoughts.


Internet brings this to a massive, billion+ sized audience, though. Say the rate of pedophilia is 1 in a million (I just made that number up). That means there are, on average, 0 creeps watching your kids at gymnastics meets and pools in real life. However, upload as a public video to youtube, and like flies to honey, hundreds of creeps will gravitate toward the video.
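
Running the comment's own made-up numbers shows why scale changes the picture (the prevalence rate here is invented, as the parent says):

    rate = 1 / 1_000_000              # made-up prevalence from the comment
    pool_audience = 200               # a local meet or pool
    youtube_audience = 1_000_000_000  # "billion+ sized audience"

    print(rate * pool_audience)     # 0.0002 -> ~0 creeps on average
    print(rate * youtube_audience)  # 1000.0 -> and recommendations then
                                    # concentrate them on a few videos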


Well I can see the number of views. They aren't high. The one that got ~700 views was probably because the title said something about "upside down on monkey bars," but she was wearing jeans, so I don't care. I doubt anyone was gratified.

Pretty sure none of the videos I have posted have a billion views.

And a creep in real life can easily follow you home.


The actual rate is somewhere between 2 and 5 in 100. It's quite high, estimated to be more than 10 million Americans.


Do you have a source for this? That seems alarmingly high.


That does seem really high. Then again, it's hard to measure because you can't exactly send out a survey asking people to admit to being a pedophile.

It wouldn't surprise me though - perhaps sexual variants just tend to happen in that rate window... chemistry doesn't care whether the particular variant is ethical or not.


> there are, on average, 0 creeps watching your kids at gymnastics meets and pools in real life

I suspect "people spectating at gymnastics meets and pools" is a not a random sample. Some will be relatives or friends of the participants, but who are those others?


The recommendation algorithm should distinguish between viewers who happen upon a video like this once in a while and viewers who binge-watch them, and disregard the latter when putting together input for the algorithm that determines what shows up in people's "Watch Next" sidebars.

They could also detect links to compromising timestamps in the comments, take them down, and shut down the accounts that post them. It's not particularly challenging.

(I used to work as an engineer on YouTube's abuse team many years ago.)
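
Both halves of that are easy to sketch. Assuming comment text and per-user watch counts are available (all names below are hypothetical, not YouTube's real internals):

    import re

    # YouTube turns "2:33" or "1:02:33" in a comment into a deep link.
    TIMESTAMP = re.compile(r"\b(?:\d{1,2}:)?\d{1,2}:\d{2}\b")

    def timestamp_comments(comments):
        # On videos already classified as featuring minors, these are
        # the comments to take down and the accounts to review.
        return [c for c in comments if TIMESTAMP.search(c)]

    def capped_watch_signal(watch_counts, cap=3):
        # Down-weight binge viewers: a user's Nth view in this category
        # stops counting toward "Watch Next" input past the cap.
        return {user: min(n, cap) for user, n in watch_counts.items()}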


What do you do now? Just curious [=

That's a really cool idea.


One thing my partner and I disagree about is taking pictures of our kids in the bath or paddling pool, on the basis that they might end up in the public domain somewhere.

My personal opinion is that it's the other guy's problem if they want to sexualise images of my kids; if I have to self-censor, we've already lost. But as my partner points out, principles don't count for much if you've got pedos trading pictures of your kids, so.....?


I don't think _taking_ photos should be a problem, but what do you do with them afterward?

I see people posting photos and videos of their children in public Instagram accounts all the time, and I think that's a terrible thing to do. Not because creeps might get off on them (though of course that's bad), but because it shows that the parents -- who have been entrusted to make decisions on behalf of their minor children -- are showing a distinct lack of respect and care for their children's privacy.

I don't have kids, but if I eventually do have any, I will never post photos or videos of them on IG, FB, YT, etc. I might privately share some things via better-protected, private-by-default channels (email, messaging apps, possibly private sharing on something like Google Photos), but that's it.

This isn't about self censorship; this is about understanding what these public services are about, and making responsible decisions about what I do and don't post.


Someone posted a video of a guy who was making this same sort of video, but about a 13-year-old doing ASMR.

That's actually creepy.

I don't think we can stop pedos leering at kids online. In person you can tell them to get lost or call the cops.


I saw the video you are talking about.

The worst part is that I disagree with his claim, or rather I don't think he makes a strong argument that these videos are _intentionally_ sexual. It looks to me like a kid doing kid stuff. I might have made the same videos at her age, _but_ it is still sexualized by some people.

I am not sure what a good solution to that could be. According to these news articles, it looks like any video showing a kid is potentially sexualized.

Banning all kids on youtube looks a bit extreme and would raise its own set of problems. And at this point, just having a kid in any movie/series has the same consequences.

So yeah .. not sure we can stop that. And if you are sharing timestamps of kids doing squats, you are probably too far gone for a mere suggestion to see a psychiatrist to be enough.


Yeah, whoever made those articles is the one sexualizing innocent videos. Makes you think...


Okay, can you reword that? What is wrong with kids doing squats? If nothing, then, what is wrong with timestamping kids doing squats?

The only thing I see wrong here is the sexualization, but it'd be us doing it if we assume the reason for the timestamp. Compare: a video of people playing soccer, and someone adds a timestamp to a part where a guy passes the ball to another guy. Would you think the timestamp was created because of sexual arousal and say they need to see a psychiatrist? How is a kid doing squats different?


It's about context. The context establishes the intent.


My problem has been with people assuming a context and an intent. I keep repeating myself, but a comment on a cat-playing-with-a-yarn-ball video linking to another cat-playing-with-a-yarn-ball video doesn't seem different to me from a comment on a girl-eating-a-lollipop video linking to another girl-eating-a-lollipop video.

These videos exist, and people like to watch them, but I don't see people assuming the context is that people with some cat-with-yarn-ball fetish intend to share it with others who have the same. Nor were the children eating the lollipops intending anything other than having fun. There's nothing wrong with either, other than people assuming why people upload videos, watch them, timestamp them, or share them.


But it's not just the link, it's also the comments that go along with it. One example was "this is why you're here" with a link to a timestamp in the video. That context is unmistakable. There are certainly less obvious cases though.


Please search for my reply on the "this is why you're here" case on this page for more info. Basically, I claim that to reach such conclusions you need an imaginary pedophile in your mind who gets aroused by the material (but not the entire thing, only the timestamp). Then, and only then, does this product of fiction make that context unmistakable.

In reality, I could have in my mind someone who gets aroused by... maracas (I don't know, it's the most mundane thing I could think of). There's a maracas video, and there's a comment with a timestamp that says "this is why you're here", so in my mind, there's this group of people who get aroused by maracas and this comment is for them. The intent and context are clear. All this fabrication is no less real than yours.

Chances are there's a guy out there who actually matches the pedophile in your mind; however, they're not wasting their time clicking timestamps on YouTube. If you research porn, people adapt and need worse and worse things to get off. The actual pedophile stopped being impressed by such timestamped content a long time ago.


I hear ya. It’s one of those things that, like Janus, are two-sided.

But imagine: store employees make a peephole in the dressing rooms. Sure, parents help their children change, depending on age, and it's all innocent, but an unsupervised, uninvited stranger is a little different, right?

Same with these kids' videos. Now of course a good question is who's uploading, the kids or the parents? Parents should know better than to make family vids available to all, and Google should do a better job of figuring out if a kid's old enough to upload if the parent isn't involved.

They may even default to "private"/unlisted if they can auto-determine the uploader is a kid, and advise parents if content, though innocent, contains minors (this should be the default for parents anyway, because although they have the right to make decisions for their kids, they should think hard before putting their kids out there on the internet, if nothing else for their kids' privacy).


>but what else do you do here?

I don't have an answer, but considering the size of Disney's and Nestle's cumulative marketing budgets, YouTube engineers will be hard pressed to do something.


YouTube is not Facebook, and judging by the continued neglect of YouTube's comment system, I doubt their ability to add fine-grained privacy controls, especially since they abandoned Google+ integration.

This isn't a new problem, as their comment subsystem has been a usability nightmare for years. When a comment thread gets to a certain length, following the conversation and pinging specific replies becomes increasingly difficult. The late John "Totalbiscuit" Bain had to turn off comments on all his videos and direct all discussions to Reddit, because once the comments start to get rowdy it's impossible to moderate them.


YouTube tried to fix their comment system by forcing people to attach comments to Google accounts with real names. Anonymity is part of the problem (look at how much banning Reddit has had to do due to advertiser pushback; it's not an open platform at all. Voat and 4chan are good examples of what anonymity leads to).

Anonymous comments are always going to go down that path. The way Facebook slowly released and tried to keep all accounts real has contributed to the way its network works. We see other sites trying to do this, like OkCupid, which tried to force people to use their real first names.

I personally think there is something that's lost when you lose that anonymity. There is a realness people let themselves have, and there's something valuable that can be learned about society, which is overshadowed by just calling comments a cesspool.


Both a usability nightmare and a sanity nightmare. The content of the comments there is about as bad as facebook's. To be honest if they removed the ability to comment then nothing of value would be lost.


The opinion people have of youtube comments says a lot about what sort of videos they probably watch. You hear the sentiments echoed by the creators that run different sort of channels too.

Some people think youtube comments are complete trash. They usually watch videos from very large channels. Other people think youtube comments are dull and rarely insightful, but not particularly horrible. It's common to hear this about channels with a million subscribers or less. But below 100,000 subscribers, in the long tail of youtube, you'll often find that content creators enjoy having reasonably constructive conversations with their viewers in the comments.

There are always exceptions, but these are the general trends I've noticed.

Examples of the first category, the very large channels, are easy to find. Pewdiepie, and anybody like him, have awful comment sections and I think everybody knows it. I subscribe to several very small channels with only hundreds or thousands of subscribers that have great comment sections, but you'll have to excuse me for not sharing them here. In the middle range, I can give you a few examples. Matthias Wandel (woodworking) and Forgotten Weapons (history and overview of rare/historical firearms) both recently passed 1 million. Their comment sections are rarely insightful, but typically not offensively bad. Mostly just people making the same obvious joke over and over again, or people trying and failing to be insightful.


God help the woman who tries to do something like your woodworking or firearms channels, especially if she is attractive. Maybe all those commenters are 12 years old, but I doubt it.


I'm not sure; to be honest, most of the women I'm subscribed to are in the sub-100k range (I think all are sub-200k). I think those comment sections are mature and constructive more often than not. Within that range, I still notice a general trend of comments being better when channels are smaller. The curve might be shifted, but I think the trend line probably has the same shape.


Comments on youtube are far far worse than on facebook, imo. Hit up any tow truck channel and you'll see overt, unhindered racism, for hundreds of comments.


It’s just that you don’t agree with it.

There is a big chasm between the censored internet (Facebook, newspapers), where everything is polished but only reflects 25% of the population, and the free internet (the rest), where you get an actual image of diversity. It’s just that you don’t like this diversity and you want them not to exist, but their points are, if not correct, at least vastly unaddressed because visible society despises them.

I defend men’s rights. All MR forums are underground. They’re not happy; they’re unaddressed and despised by society. And that’s 49% of the population. Saying men’s rights are a non-issue which should not be addressed and deserves the censorship it gets on the above-ground internet is, by nature, discriminatory towards men. It’s just an example, but you won’t be able to keep the pressure cooker sealed very long.


I appreciate that you care deeply about this issue. I hope this means you are open to me challenging your points:

>It’s just that you don’t agree with it.

Correct, I don't agree that it's ok to refer to groups of people as "jobless apes," and I despise the mindset required to dehumanize like that. I find it disgusting. I also find it irrational and dangerous, and I want it to go away. Is this a "just"? I am happy to dismantle any "advantage" of a racist mindset, if you'd like. It simply does not need to exist - the existence of racism is a failure of society. Every member of society, I believe, has a responsibility to make it Go Away. That includes youtube.

> but only reflects 25% of the population

I am surprised to hear you suppose that 75% of the population is, for example, racist. I strongly reject this hypothesis but welcome your evidence.

> an actual image of diversity. It’s just that you don’t like this diversity and you want them not to exist, but their points are, if not correct, at least vastly unadressed because the visible society despises them.

"just," again, implies that my dislike of this "diversity" and my desire for it to not exist, means it should? Racist points should remain unaddressed - they are wrong. They deserve no more attention than someone saying "there is a dragon in my garage, you may not see it but you must worship it with me" - riddled with fallacy and taking no energy to generate, these ideas and concepts must be rejected with as little energy as it took to generate them, or we open ourselves up to "death by bullshit." A DDOS of bullshit.

Society should despise many of these concepts. Racism is stupid, pointless, a waste of energy, and immoral. That someone that is racist gets despised by association is a shame - I wish we could all be Dale Carnegie. Punish the sin not the sinner.

>And that’s 49% of population

I reject the notion that all men are men's rights activists - that idea isn't even kosher by Red Pill values, because "cucks" exist in their value system, and those obviously aren't men's rights activists. Furthermore, my own experience disagrees with the supposition - I have met thousands of men, and about two men's rights activists. On the internet, I only find them when I hang out on 4chan or the boards made for them on Reddit. 49% of the population is "underground"?

The phrase, "men's rights activism" does not draw meaning from the words alone - the definition encompasses more. I posit that equating the censoring of Red Pillers is not equivalent to discriminatory actions towards men in general.


It's not too bad for videos with a smaller (<100k) view count or videos just not of general interest. The comment system seems to cope well, and people who comment at this stage are usually interested in the topic and have something constructive to say.

Once the number of comments reaches a tipping point, it's time to bail out.


So his examples might be quite tame and understandable. But there are a lot of others that aren't. There's a Reddit community that's been documenting channels that have these kinds of videos: https://www.reddit.com/r/ElsaGate/

EDIT: some of the worst I've seen usually involve Mickey or Minnie Mouse doing drugs and other things, while containing a bunch of keywords intended to get into recommendations. https://i.redd.it/i5rvr5ad9ug21.jpg


I have a couple of kids, 8 and 5, and I do let them upload videos, but I check them first, and if there is anything that I think a pervert might like, I edit it out first.

There is no need for YouTube to have any video of my children a pervert might want to watch.


The issue isn't so much the content of the videos; for the most part, they could be altogether innocent and within the guidelines that YouTube, and the law, requires. The issue is the community that has developed around them.

> Ban kids uploading a video of their innocent pool party because some creeps might enjoy it? What about topless babies?

The children are unable to agree to the terms and conditions set forth by YouTube, and sadly, there is a possibility that their parent or guardian is aware of the exploitative nature of the community.

So, perhaps yes?


How is anyone being exploited here exactly?

> The children are unable to agree to the terms and conditions set forth by YouTube

Which means that children should not be able to upload any kind of video to youtube, but it should be fine if they upload it to, say, an FTP server for public domain videos that does not impose any terms and conditions, right?


Advertisement dollars are being accrued via the attention of pedophiles. That's clearly exploitative.


> Ban kids uploading a video of their innocent pool party because some creeps might enjoy it?

That'd likely be seen as an overreach now, but twenty years from now we may wish we'd done that.

In the meantime, making them only visible to an approved list of friends/family seems like a reasonable step for videos of or published by minors. Maybe you put together a "child actor" exemption where the parents can moderate it for public access.


YouTube should just disable comments on all videos


I agree. They provide nothing constructive.


Unless you're unaware of whether it is, in fact, 2019. YouTube should just delete every comment with the current year in it.


It's not the videos, it's the comments, and the suggestions pushing the commentors to other similar videos with similar comments.

Comments like, from "predator133," "@2:33 is what we all came here for," linking to a part of a video where a child is doing the splits or something.


That doesn't sound like something anyone who wanted to watch child porn would post, right? You want to watch your porn in peace, not get the video taken down and your account tracked by Chris Hansen and the FBI. On a private pedophile forum, sure. Not on YouTube.

It sounds more like something someone who wasn't a pedophile would post, because they were trying to get the video taken down or to bait real pedophiles into posting agreement.


Well, we're both working with the same evidence, so, I disagree. I think it's more likely that it's a pedophile, that they're not willing to risk hopping on the dark net, that videos of children on youtube are enough to sexually gratify them without fear of retribution.


Someone who's not willing to risk hopping on the dark net is smart. Too smart to assume that most people who see a comment from 'predator133' on YouTube will approve of predators.

There are neo-Nazi parades of people not afraid of being hated, but there are not, like, "I cheat on my taxes" parades. Too much to lose.

And there are a lot of hoaxers, trolls, parodists, and false flaggers out there. Often more of them than the deviants, because the deviants are weird and underground.


I don't really know what @2:33 is about, but let me take a guess and ask you: would you let actual children see the content that appears @2:33? And if they did, would they be horrified? If you asked them to describe what was happening @2:33, would they feel uncomfortable? Or would they give some innocent, mundane description?

This is a great test to see if something is dirty or if it's only dirty in your mind. If you have to create an imaginary pedophile in your mind that gets aroused by @2:33, and a whole group of them searching Youtube comments to find the timestamp for "what they're looking for", you have to wonder if you're not using some huge leap in logic to arrive at conclusions.

Ironically, what this video has shown is that Youtube's system works: this guy, to make his video, had to do what seems to be hours of research, and after it, he shows the most shocking videos of little girls he was able to find. And there wasn't any nudity, or anything actually sexual, shown. Everything was clean and every timestamp led to family-friendly content.

The "worst" thing I saw was the usage of the water emoji, and I'm like, really? People use the water emoji and suddenly you can be very detailed about their sexual orientation and why they timestamp videos or share them? All what is happening on these completely innocent videos could be explained in many ways, thinking the most likely explanation is "that's a pedophile" means the imaginary pedophile in your mind needs to be toned down.


Sure, I'd love a world where "the predator" who links a timestamp of a child doing a yoga pose with the comment "all what we want here" is a fellow child supporting the kid on her yoga moves. I'd love that. I don't believe it's true.

Why do you believe the innocent angle is more likely? Do you know people that make a habit of linking timestamps of children doing yoga poses?


A lot of the videos he shows seem like they're mostly Ukrainian, some Eastern European or Russian. Are there different social standards where parents there just don't care as much if their kids post this stuff?


I don't think Youtube can or should do much of anything here, but why on earth as a parent would you let your child upload something like that to a public website (or even worse, upload it yourself)?

There are a ton of weirdo creeps out there and putting your child's private videos on Youtube of all places seems like terrible parenting.


I actually think it might be technical naivety? Eg families know that you can share videos on YouTube, so they want to share their family videos to other friends and family, and don't understand permission settings and other such intricacies.

Like most people here likely do, I run tech support for my entire extended family, and a very very common theme is trying to share photos/video with family and friends.

Apple locks down iCloud so it's only easy to share with other iDevice users, and YouTube makes it REALLY easy to initially upload a video (drag onto this box, or just share from your phone/tablet video browser) but then doesn't clearly explain the different options in the processing screen, etc etc.

They also don't understand the repercussions. They're not on reddit reading about Elsagate; they're not aware of the deep seedy underbelly of the Interwebs outside of "people buy drugs on the darknet" because they saw it on the 7:30 Report. They think the Internet is Facebook, Google, YouTube, and Amazon.

It's extremely easy for technically proficient people to underestimate how complicated and overwhelming these "simple" (for us) tasks can be to non-techies.


I've never uploaded a Youtube video but wouldn't the parents or the kids see these comments? Don't you specifically have to turn on monetization?


Comments are on by default, and when you upload a video it gives you a "share this URL" thing after, so there's a good chance the families never actually notice the comments. If they're watching on mobile too, then the comments aren't obvious unless you scroll down below the video.

I know people in my family that didn't even know YouTube had comments. They watch YouTube on a smart TV and just use the built in search tool, and all videos play fullscreen. This doesn't fit with the above family sharing thing but it shows how little lots of people understand YouTube in general.


Actually instilling morals and enforcing boundaries as a parent is an unpopular opinion these days.


It is funny though, because it is they who make the association... I would never have looked at it that way if they hadn't told me to.


> Ban kids

Yes.


You want to ban videos of children from the internet because pedophiles might be aroused by them?


i mean, another angle is the parents that are posting videos of their kids online are not asking their kids for consent first, and the kids may be themselves unable to give consent. requiring videos of children to be unlisted seems completely reasonable to me.


Let's be clear here, many if not most of these videos are being posted by the kids themselves. Everyone wants to be the next youtube star. There are enough gymnastics youtube stars that other kids try to emulate them.


How about we turn it up a notch and also take a stab at malnourishment and overpopulation? "A young healthy child well nursed is, at a year old, a most delicious, nourishing, and wholesome food."


This was an allusion to A Modest Proposal, you uncultured fools!


The problem is not the videos being there, it's the advertisements next to them.

Youtube simply needs to move to a proper gatekeeping model around monetisation, requiring human review and biannual review checks. Start it at 10,000 subscribers to avoid being overwhelmed. Prompt once-off approval for 'viral' trending videos from new channels (even if the advertising money goes completely to Youtube).

The internet is maturing and existing gatekeeping models are too lax. Same thing with games allowed onto Steam. Now any idiot with a phone can upload something - previously you needed a computer and decent knowledge to do that.

Limiting new uploads from new accounts to 720p30 max until they hit 1,000 subs or pay a $100 'starter fee' will save on storage and processing fees.

I don't understand why Youtube (and other tech platforms) set the barriers to entry so low, then inundate themselves with work. Simply raise the barriers until your human-approval processes can handle the volume.
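For concreteness, the gating above could boil down to a couple of threshold checks. A minimal sketch, using the numbers proposed in this comment (the data model and all names are invented for illustration):

    # Sketch of the proposed gatekeeping rules. The thresholds (1,000
    # and 10,000 subscribers, the $100 starter fee, 720p30, biannual
    # review) come from the comment above; everything else is
    # hypothetical.
    from dataclasses import dataclass

    @dataclass
    class Channel:
        subscribers: int
        paid_starter_fee: bool       # one-off $100 fee
        passed_human_review: bool
        days_since_last_review: int

    def max_upload_quality(ch: Channel) -> str:
        # New channels are capped at 720p30 until they either reach
        # 1,000 subscribers or pay the starter fee.
        if ch.subscribers >= 1_000 or ch.paid_starter_fee:
            return "source"   # no cap
        return "720p30"

    def monetization_allowed(ch: Channel) -> bool:
        # Monetization requires scale plus human review, re-checked
        # roughly twice a year (biannual ~= every 182 days).
        return (ch.subscribers >= 10_000
                and ch.passed_human_review
                and ch.days_since_last_review <= 182)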


What does monetization have to do with any of this, except as a hook to attack youtube and bait advertisers to respond? Monetization is entirely irrelevant to the problematic issues at play.


All of the means to attack Youtube have been centred on questionable video content being supported with ads from mainstream companies.

Youtube should have wised up and become a lot more selective about which videos were supplied with ads.

And if running slightly fewer ads is too commercially difficult, then save costs by limiting new uploads to 720p.


And really, if you don't intend to have ads run on your content, you shouldn't use Youtube. Set up a Peertube instance. YouTube is just too big, and content creators that are up and coming should see if they can take control over their content instead of depending on Google, which is just a new version of NBC/CBS/ABC at this point.


I think EU upload filters are a perfect solution!


How is that a solution?


>I only saw harmless videos of kids having fun when I looked at his own examples.

Have you seen this -- https://youtu.be/M78rlxEMBxk?t=364 ?


My issue is that whenever the topic of "child exploitation" comes up, there's always a lack of specifics. Both the specifics of what we're actually referring to and the specifics of an actual solution.

Instead, everyone (like every media outlet and internet comment) casts a vote against "child exploitation". Who wouldn't? And the discussion stops there. "Whatever they're doing, it needs to stop!"

For example, I bet most people aren't thinking "whispering girl" or "family video of daughters playing in the pool" when they hear the phrase "child exploitation". And maybe most people think Youtube is no place for those videos, fair enough. But the taboo subject and the lack of specifics creates an incredibly charged and blind environment that makes it hard to have a real discussion.

So let's be specific here: what kind of action should be taken against this whispering girl's video and what kind of policy does Youtube need to have at scale?


Let's be specific. My first question: so, a 12 y.o. girl being dressed in a police uniform and acting like she's in the opening of a cheesy porn movie is your definition of "just kids having fun"?


I haven't suggested a definition of where to draw the line, it's exactly what I'm asking.

Is that video sexual content? And how would you suggest encoding it in a policy change? Obviously, "I'll know it when I see it" doesn't scale.


Why must it scale? Laws don't have to scale.

That the solution must scale algorithmically is purely for the convenience of the site; for matters of law and taste it's nothing but an excuse.

Even having different countries with different laws and standards doesn't scale. So what?


Laws have to be objective.


Yet for matters of taste, public decency, movie censorship, obscenity, etc., laws necessarily depend on "I'll know it when I see it" to some extent. Or you end up with absurd lists of approved and banned words, poses, bits, and "you may only reveal up to 5" above...". Objective but useless. In Florida public nudity is legal unless "vulgar or indecent". Objective how?

Then we inject judges and juries into the process to make them even more subjective.


That's exactly why those are all examples of bad laws.


So how would you propose legislating them? Or are you suggesting that there should be no limit at all on what is available or permissible, or of age? Case law and precedent generally makes it clear enough, and adapts to changing expectations without needing to be rewritten. It may adapt slower than many would like. Doesn't make the laws bad.

Every country has some limits, and standards for movie ratings and age restrictions etc.


I propose that there should be no limits on what's available or permissible that cannot be objectively defined. You can ban alcohol for people under 21, for example, but not for "people that I know when I see them shouldn't have alcohol". If you want to ban violence, you have to be specific about what this means (e.g. "no blood", "no slaying", "no physical struggle" etc).

If some legislation cannot be so rewritten, it should be thrown out. Yes, that goes for anything with "vulgar" or "indecent" in the title.

Speaking of standards for movie ratings - they aren't law in the US, just industry guidelines. So they're subjective, but people have a choice of following them (e.g. because they believe the board that rates stuff has the same subjective opinion of what is "inappropriate" as they do) or not. Similarly, movie makers can, and do, release unrated movies. Since it's all voluntary, there's no problem here.


I don't have an answer.

But what I can tell is that the video I linked above... I'll easily give a 99% probability that it was made on purpose, with a sexual context in mind, most probably by some adult, not another kid, and it is not "just a kid having fun", and it's not a "family video".

This is softcore erotica with a 12-14 y.o. girl, and it was made to be softcore erotica. There is literally no other reason to shoot a video like this.


If you don't know anything about ASMR you might think that. But that's pretty standard content in the ASMR-sphere. The only difference is that it's being made by a preteen.


>If you don't know anything about porn you might think that. But that's pretty standard content in the porn-sphere. The only difference is that it's being made by a preteen.

is also true.


Some loose similarities do not establish the sexual nature of these kinds of videos.


>loose similarities

Without sex or nudity, how do you propose you'd make it more similar?


I'm not sure what you're asking. The argument was attempting to use a similarity to establish the pornographic nature of the content. My point was that a loose similarity doesn't prove his claim.

Establishing a criterion for pornography is hard, but the usual metric seems relevant: designed to sexually arouse and otherwise devoid of value. But roleplay ASMR content presumably isn't designed to sexually arouse, and has other value for those who experience a physiological ASMR response from it.


I can't think of a single element to add that would make it more similar to pornography without it being actual pornography. I believe it is as similar as possible, and getting so close cannot be accidental. I assert that if you can think of something that would make it more similar, your premise that the similarity is loose may hold water; if not, it would seem that I'm correct and the similarity is uncanny.


>I can't think of a single element to add

How about explicit language?

>and getting so close cannot be accidental

Probably because sexual arousal and ASMR are both physiological responses.


>How about explicit language?

Can you point to another piece of media, say something on the Disney Channel, where adding a few curse words would turn that production from innocent children's entertainment into soft core pornography? If not why is that a unique feature to this particular video?


Somehow I knew you weren't actually going to accept your own stated criteria.


I'm just pointing out that you are engaging in special pleading[0] here. I've seen pornographic videos that did not have explicit language prior to the sexual acts taking place. In fact most of them are like that during the brief interludes of plot that justify the sex.

Let me put it this way: if I were an evil person and my goal was to create a legal facsimile of child pornography, I don't know that I could do better than this video did at that stated goal. Perhaps legally it doesn't cross a line, but can you honestly say that you'd leave your daughter in the care of the person who made this video to do the same?

[0]https://en.wikipedia.org/wiki/Special_pleading


I don't see how anything I said was special pleading. You set up a challenge that purported to test whether this content was pornographic. I answered the challenge. It was a bad criteria from the start (as in the answer to the challenge was irrelevant), but I played along.

> I don't know that I could do better than this video did at that stated goal.

What is this supposed to establish? That creeps can get off on this video? Creeps get off on mundane things all the time. It establishes nothing.

>Perhaps legally it doesn't cross a line but can you honestly say that you'd leave your daughter in the care of the person who made this video to do the same?

Since the artists in the video are generally those creating it, sure.

Presumably you mean would I leave my daughter in the care of someone who enjoyed such videos. If I knew them, sure. If I didn't know them, it wouldn't really add much information. It would slightly alter my prior against them, but that's not saying much. In the space of non-pornographic things that creeps would get off on, this video surely ranks higher than average. But this isn't an indictment of the content.


[flagged]


Spoken like someone who doesn't experience the physiological ASMR response. As someone who does experience it, you're obviously speaking nonsense.


Even if this supposed physiological response exists, that doesn't make the content any less fundamentally pornographic.


Now you're just resorting to begging the question.


But are you sure that no one on this planet would consider it as "kids having fun"? YT is a global medium. Just because you (or even the majority of people) are offended, it doesn't mean that everyone should be.


A 12 year old girl, whispering in a sexual manner, wearing a stripper style police outfit, saying things like "I need to leave for my tinder date tonight" and "this is never happening" while pointing at the camera and herself. (and that's just the first 30 seconds)

I'm sorry, if you think that's "kids having fun" you belong on a cross.

Edit (more details on exactly why this is wrong): It's shot in the style of the POV porn videos that are popular today; the driver is male (why not female, if the purpose is not sexual); his name is used as infrequently as possible (don't want to ruin the immersion for the viewers); she holds up handcuffs and suggests that she "might have to use them on you"; the outfit either doesn't fit or is purposefully open in her chest area; and she is wearing exaggerated makeup and press-on nails in a manner fitting a central-casting prostitute.

It's about as extreme as you can get without the FBI coming knocking on your door. I'm an atheist, and I'm praying that someone, somewhere in law enforcement is keeping a close eye on her. Poor girl.


So is the girl in this video... deciding to make this herself? Like, she saw other examples of this with adults, and then decided doing this as a child was somehow a good idea? In that linked video, the guy reacting points out the terrible Tinder joke. So therefore, she's aware to some extent how fucked up this is, right? ...

I realize some of these kids make tons from ads on these videos .. but at some point ... that's just bad parenting. Like what are you teaching your child?


An eleven year old shouldn't be uploading content to YouTube. An eleven year old doesn't understand the second or third order consequences of their actions when they upload a video of themselves, along with their friends, trying on bikinis in their bedrooms.

If you're under 16, your content, and the uploading of it, should be supervised. Why are we letting our children run free on the Internet? You wouldn't let them run free on a highway and the Internet is just as dangerous but in different ways.

I believe YouTube needs to implement a system of age verification for content creators. 16+ minimum if you ask me. If the parents do the uploading, whelp, you found your issue (if any arise)


>You wouldn't let them run free on a highway and the Internet is just as dangerous but in different ways.

Kids here roam across the city unsupervised by age 7-8. This includes highways. Also, correct me if I'm wrong, but doesn't YouTube require you to be at least 13?


> Also, correct me if I'm wrong, but doesn't YouTube require you to be at least 13?

You need to be at least 13 (or older, depending on the country) to create an account. There's no limit on the content itself.

In my opinion, there is no justifiable use case in which videos of children under 13 should be made publicly-available, but YouTube itself disagrees. Case in point: Ryan ToysReview has 18 million subscribers. YouTube thinks it's okay because the videos are uploaded and edited by his parents, while I think it's absurd that anyone makes an income based on sharing videos of their children publicly.


> YouTube thinks it's okay because the videos are uploaded and edited by his parents, while I think it's absurd that anyone makes an income based on sharing videos of their children publicly.

Movies and TV shows starring children are not unheard of as commercial endeavors. I'm not sure why, with YouTube as a viable monetizable video platform, parent-driven productions on that platform are fundamentally any worse an idea.


Yes, movies and TV shows with children under 13 are widely-accepted, but being widely-accepted doesn't mean it's morally alright.

There might be a time in which we reject such content just as we did with prior widely-accepted behavior that was immoral (like homophobia and racism). Future generations might look down on us because of child celebrities (among other things), and I sincerely hope they do.


> Yes, movies and TV shows with children under 13 are widely-accepted, but being widely-accepted doesn't mean it's morally alright.

You’ve presented no argument as to why it's not alright. Now, this may be a base moral axiom for you, such that no argument is needed, but that's not particularly convincing to anyone who doesn't already take it as a moral axiom.

> There might be a time in which we reject such content just as we did with prior widely-accepted behavior that was immoral (like homophobia and racism).

There might be a time when we see anything in any way you can imagine. That's not an argument that we should.



You want to destroy Leon, Star Wars and Aliens?


This just seems ridiculous. I post a lot of videos of my daughter to share with my family and friends, and generally don't care who looks at them any more than I care who looks at her (or what they are thinking) when we are in public. I've gotten a couple creepy comments here and there from random people who found them. (the comments weren't overtly sexual though, but I deleted them anyway) Here's an example of a harmless one that got such comments (note that my daughter is the one in the jeans, I tried to be careful when shooting it regarding the girl in the skirt): https://www.youtube.com/watch?v=1p21NMw54Y0

I also notice that video has 10 times the number of views as the rest of the videos of her, presumably because of the title.

I had a couple videos from when she was younger and wearing diapers and a shirt when she was wading in the bay, and got a comment from someone who wanted to know what brand of diapers she was wearing. (wtf?) I deleted the comment of course, but a while later I see that YouTube has deleted the video for "inappropriate content" (double wtf). I mean, she was more covered up than if she was wearing a two piece bathing suit.

The thing is, even if someone is a perv and gets off on this sort of thing, it isn't endangering her. I'm not going to lose sleep over it. If I am really that worried about protecting her from pedophiles, I probably shouldn't take her anywhere in public, where someone could follow us home or otherwise directly cause her harm. I'm not going to live my life in that kind of fear.


While certainly child predators are an issue, I don't even need to consider this sort of thing from that angle.

Why do you need to post publicly rather than marking it as unlisted and sharing the link with people you know and trust? Would you personally like to have a video posted of you in your underwear? Even if that's okay with you, perhaps your daughter will grow up preferring to have a more private life, and to keep things like that off the internet? But you've taken that choice away from her.

One of my favorite articles on this topic: https://slate.com/technology/2013/09/facebook-privacy-and-ki... (and a follow-up, after the author was informed that she had unwittingly allowed some photos of her daughter out onto the internet: https://slate.com/technology/2013/09/privacy-facebook-kids-d...). I don't know that I'd go as far as she has, creating a sort of "digital account trust", but I do like the approach of keeping a kid's private life off the internet until they're old enough to decide how much they want to share.


> While certainly child predators are an issue

In what sense are "child predators" an issue? Because they leave disgusting comments?

> perhaps your daughter will grow up preferring to have a more private life

... or perhaps she will grow up preferring to get as much publicity as possible. But if her parents don't publish that video on YouTube, she loses that opportunity.

In any case, this publishing decision is in the hands of her parents, and clearly, the parents made up their mind (and decided to publish that video).


As far as underwear goes, no I wouldn't care if I was under age 3 in the photo. Seeing a kid in a diaper is an everyday public thing. In that case, she was in a diaper rather than a bathing suit because she was still diaper age. I don't see it remotely as something to be embarrassed about.

I don't "need" to post publicly but I also have no reason not to. I'm not going to let people having creepy thoughts control my life, unless there is an actual, tangible danger of harm.

Honestly, if I was worried about creeps, I wouldn't let her wear a skirt in public. I see other girls at the playground all the time that make me wince a bit...I'm far more likely to dress my daughter in pants. But at the end of the day, I'm 1000 times more concerned about people in real life than on YouTube....because they are right there.


Have you considered that your child might really regret all this stuff you're publishing about them when they're older?

I don't exactly care myself but I do appreciate that my kid isn't capable of consenting to having his childhood published. So I don't do it.

It seems really odd that your reason for making that call is, "meh, why not?"


It's more like having considered it, I have no idea why they would care. A picture of me as a kid is essentially a picture of some other person who is five years old. When I see a picture of some five-year-old kid I don't think "Wow, they might feel terrible if they knew I could see this."

Maybe the sleepy baby girl from https://www.youtube.com/watch?v=KTCQpjUrCe8&index=147&list=L... will grow up to be really shy and feel that wasn't her best angle. That strikes me as a personal hangup, not some constitutional right we should all be protecting her from.

It's funny how heated I can get about the notion "there are people whose reaction _isn't_ 'meh, why not?' What's wrong with them?" I'm like a militant meh-why-notter.


Militant meh-why-notter. Can I use that? :)

(thanks for your support, btw...)


What exactly is "all this stuff"? Photos of her playing?

I think about things she might actually be concerned about, but I see no reason she should be concerned about it, so no I'm not going to worry about it. If I was concerned, it might be about "should I put her in a dress or have her wear pants?" Because if she is wearing a skirt, someone could see her underwear, god forbid. YouTube or not.

At least on YouTube, it's pretty hard to follow someone home like can happen in the real world (e.g. Jayme Closs).

You know what I spend my efforts worrying about? Her getting hit by a car. Drowning. You know, real things.


> You know what I spend my efforts worrying about? Her getting hit by a car. Drowning. You know, real things.

Privacy is a real thing, and worrying about it isn't mutually exclusive with any other concerns. I'll be honest and say I'm somewhat disturbed by your whataboutism and nonchalant deflection of the fact you're pervasively invading your child's privacy and posting it online for all to see, consent be damned. Yes, you "see no reason" to be concerned, and that's your choice to make as a parent, but at the same time it's an extreme stance you've taken when, as other commentators have pointed out, the safest and most logical thing is to be more conservative with sharing such information.


"Privacy is a real thing, and worrying about it isn't mutually exclusive with any other concerns"

Sure, but you need to balance things. Having someone on the internet see a video of her, fully dressed, and getting aroused by it, doesn't exactly compare to any number of real issues. As I said elsewhere, if I actually was concerned about her privacy so much, I'd go to a bit more effort to avoid having her wear a skirt or dress when she is playing. That seems like a good first thing to worry about.

Although not many people see the videos I post of her, she was thrilled when a nearby YouTuber (a 7 year old girl who actually has a significant number of subscribers) found her via this video and we'll hopefully get together for skateboarding together. So there are positives about putting them out there. https://www.youtube.com/watch?v=KuIEW9MCmcg

Meanwhile, I've dealt with my daughter's mother for years, as she freaks out over the most ridiculous tiny irrational fears, while nearly killing our daughter due to her recklessness in other ways.

You may think I have an extreme view, but you know who has to be concerned about being seen as having extreme views on things like this? People who are in contested custody situations, where their ex hires aggressive, unethical lawyers willing to grasp at any straw to tear down their parenting in a court of law. All the more so if that person represents themselves rather than having an attorney.

And that's actually me.

I assure you, if posting YouTube videos of my daughter playing at playground was something they thought they could use against me, they would have. But they didn't. And I won about the strongest decision you're going to find, and was awarded primary custody.

So yeah, I reject your premise that I have extreme views on this issue.


There are no objective solutions to this kind of "problem". You think it's disturbing; others think it's not. Whatever solution you propose, some will be happy, others will be unhappy.


Well, kind of. One could argue that you can't unring a bell: your child can always publish all their childhood stuff later on if they decide to, but they can't unpublish what's already out there.


> As far as underwear goes, no I wouldn't care if I was under age 3 in the photo.

Clarification, though: I was talking about you-the-adult, not some hypothetical video of you from when you were 3. Maybe you would be ok with that. That's fine. You're an adult and can decide for yourself what you want to post online about yourself. (I personally do not want videos of my 37-year-old self in my underwear on the internet, but that's just me.)

Anyway, as I initially said, I'm not even talking about creeps here. I, personally, as an adult, would not want photos/videos of me as a child floating around on the internet. It's not a matter of embarrassment or safety, it's just that I don't want my life to be public to that degree.

But hey, you can always argue that I'm an adult, and I didn't grow up with always-on internet connectivity (or, really, much in the way of internet connectivity at all until I was in high school), so maybe my opinion doesn't count, because culture and social expectations are different now.

But... they're not, really, at least not universally so. There are people who are children right now who are uncomfortable with this[0]. The linked article also mentions, for balance's sake, some kids who enjoy having a ton of information about them online. But that's the thing: you can always put more information about yourself online, but once it's there, it's very hard (sometimes impossible) to take it back. Parents who share things about their kids online rob them of the agency to decide for themselves how private they want to be. That's just fact, regardless of what you've decided.

[0] https://www.theatlantic.com/technology/archive/2019/02/when-...


> Why do you need to post publicly rather than marking it as unlisted and sharing the link with people you know and trust?

Not always realistic. If you record some Boy Scouts event, you probably don't know the name of everyone there, let alone account information. People often email a link, and then people share it by forwarding it.

Alternatively, of course, you could upload it to some website for your organization, but again it'd generally need to be public, or you have the usual IT horror of getting everyone an account and dealing with people who say they can't log in. There is a reason people just toss it on YouTube.


Unlisted links shared via email can still be opened by anyone with the link. They don't show up in recommendations and search results, but they're not private/whitelisted.


> Why do you need to post publicly rather than marking it as unlisted and sharing the link with people you know and trust?

Important point, still kind of weird to me, even in this day and age of Facebook and social media in general, where everyone feels the need to share every intimate detail of their lives with strangers.

---

http://pleaserobme.com


I personally think you're missing the point of the article.

The content of your videos (based on the example you gave) is harmless. Adorable, in fact. You have a beautiful daughter - congrats. Sadly, you're missing the point that you did the uploading, not her.

The article is about children, I'd say around 11-14 in age on average, possibly late teens too, uploading videos of themselves... by themselves. And the videos range from adorable ones of them down the park in jeans to videos of them in bikinis in their bedrooms. Some of them are videos of them sucking on lollipops... I mean, really?

Also, the videos being uploaded are flags. They're used to signal to pedophiles that there's other content available outside of this video. They're guiding lights for people who want darker content.

Your videos are just your videos, and you moderate them. It's not the same thing at all.


> You have a beautiful daughter - congrats.

Creep!!! Just kidding.... Thanks, she's the best. :)

Point taken. I get a bit defensive here and there, and I guess what I'm more defensive about is reflected more here in the comments, where people are so quick to shame parents for not being so worried about things that I think are non-issues. (note that I am a bit of an outlier at Hacker News, in that I am in general not particularly concerned about privacy...in the sense that I don't worry about, say, Amazon following me around the web because they have figured out that I am in the market for digital pianos and skateboard parts)


> The thing is, even if someone is a perv and gets off on this sort of thing, it isn't endangering her. I'm not going to lose sleep over it.

But... it's a video of her, not of you. Is she going to lose sleep over it if in five years she goes on YouTube and discovers someone has cloned and monetized the video, and a bunch of creeps have commented on it?


Well I'm her parent and I get to make such decisions for now. That's how it works. If I had any concern whatsoever that she might be embarrassed about something I post, I'd take that into consideration, but I'm not.

Someone could video her at a playground, and post it on YouTube. Should I not take her to the playground out of fear of that?

Honestly, I think I'm probably more concerned than most other parents, since they put their daughters in skirts and let them play on the monkeybars, while I tend to dress her in pants when she is going to be doing that sort of thing. But it's probably more out of concern for other dads, who (thanks to all the hand wringing about this sort of thing) tend to feel like everyone is on the lookout for someone who is creepy.


Can I ask why you would make those videos public, instead of private and sharing them with just people you know?


Because a few people subscribe (family and friends), and I put them in a playlist, and so on.

I mean, I don't care. She's just playing at the playground. I didn't think about it at the time, I guess I could, but I don't see why it matters. If I was concerned about anything, it would be about creepy people seeing her in the real world.


Based on that use case, you would probably be better off using a product like Google Photos instead of Youtube, since the latter is more for publicly broadcasting. Just my 2 cents.


Google photos is awful. I do use it (in fact that is where my videos go by default), but sharing stuff isn't a good experience.

If I was worried about the public seeing stuff, I'd just make things unlisted, and sometimes I do.


After reading all of your comments here, the only image I can form of you is of a guy with his hands over his ears yelling "I CAN'T HEAR YOU!" You need to wise up. Before you read this article, you were naïve like many parents, but now you're just being wilfully ignorant by continuing to defend these videos being public.


> The thing is, even if someone is a perv and gets off on this sort of thing, it isn't endangering her.

Careful with that idea, it's some pretty enablist shit, and you most frequently hear it coming from the perverts themselves.


What matters is if it is wrong or not, not who says it.


Yes, hence the "is enablist shit"...

And no, being lumped in with pedophiles matters. I am trying to warn him who he sounds like, not making a rational argument against his position. It's up to him to decide if he wants to sound like a pedophile.


Well I'm talking about how I prioritize decisions about protecting my own daughter from harm from others, not about whether it is ok for me to look at / think about other people's daughters.

Given that, if I sound like a pedophile, to me that says more about the listener than the speaker.


Probably, but the listener may also be a judge, or the parents of the kids your daughter plays with, or someone else who has a material effect on the quality of your life.


Yup, spent my time in front of judges. Self represented, up against nasty lawyers, solid week long trial. If this was a thing that could be used against me, I assure you, it would have.

It wasn't, and I was awarded primary custody.


Okay cool, how can I get in touch with your ex then and let her know about this conversation?


Not gonna post it here, but contact me rjbrown at gmail and I'll give you her email and her lawyer's email. Knock yourself out.


It's weird how far you're okay with going here, honestly this makes me concerned there's an issue more than your original comment.

All I said was "be careful" and now you're giving me your ex's lawyer's contact info. What a weird escalation!

This can only go badly for you, I hope you realize...


Youtube wants to cater to everyone, but to me it is clear Youtube is a risky place for a child. I let my kids use the site, but I have no expectations of anything less than what we are seeing here, and I try to keep an eye on things from behind.

I myself have found the types of wormholes he talked about in the video. Searching for massage techniques, I saw a pretty girl, clicked it, and the next thing I know I am down a wormhole of naked women being massaged on Youtube. Another wormhole is the one pointed out by another youtuber about breastfeeding videos: search for breastfeeding and you will see these creepy videos pretending to be educational but which are clearly just a video of a woman showing her breasts while a child feeds. Then I came across videos that seem to be educational but were really just a large-breasted woman standing naked while the video described all her lady parts with close-up shots.

So clearly Youtube has all sorts of crazy things, from how to have sex to how to build weapons. I am not kidding myself: it is a kid-hostile environment. But they do have so many good videos for kids, so I allow my kids on the platform. I just do my best to keep an eye on what is going on.


> Youtube wants to cater to everyone but to me it is clear Youtube is a risky place for a child.

The Internet, in general, is a "risky" place for a child, always has been and always will be. But trying to "child-proof" this whole thing will just completely destroy what little remains of its original idea. Doesn't YouTube by now have curated categories with specific themes? Imho it shouldn't be that difficult for them to offer something like that.

And yes, YouTube is full of weird semi-soft-core content, which I consider quite amusing and innocent in its own way. It's kind of cute to think there are so many users on there who seem to be completely incapable of finding actual porn on the Internet, and instead need to use YouTube as the digital equivalent of a Gloria's Secret underwear catalog.


Why would you assume that semi-soft-core material appeals only to people who can't find actual (presumably hardcore) pornography?


Didn't you yourself grow up in such a "hostile" environment, though? I saw all kinds of crazy things online as a kid. It taught me what the real world can be like, but I still had the safety of being able to turn it off and walk away. I think if my parents had supervised me at the time, I would've been poorer for it. For example, shock sites like goatse taught me to be suspicious of what links I click on. That eventually led to me learning about phishing.


What I don't get is why the fringe community even exists - is there that large of a market of people with porn websites blocked, but youtube not blocked?


Consider that "softcore" porn doesn't exist just because of restrictions, it also serves a market for a certain set of tastes.

I can see someone preferring to watch two attractive women massage each other on YouTube over the content found on porn sites.


People at work?

Other than that, honestly, the only population I can come up with who would be watching these on youtube instead of an actual porn site would be kids.


What would stop kids from going to a porn site anyway? The question whether you're 18+?


School / company network with content blocker?

