I looked at some of the videos that appeared in his report and it's basically family videos of young girls, the same kind of videos my own sisters would shoot with our dad's camcorder. His first examples are videos that come up when you search https://www.youtube.com/results?search_query=orbeez+bath.
I'm sure these videos do attract some weirdos in the comments, and in the next paragraph Youtube says that they want to moderate that activity, but what else do you do here? Ban kids uploading a video of their innocent pool party because some creeps might enjoy it?
The youtuber's video includes the provocative phrase "Sexual Exploitation of Children", but I only saw harmless videos of kids having fun when I looked at his own examples. He also focuses a lot on the comments themselves, which is a very different argument.
Your family camcorder videos probably weren't made available for checkout at the public library, filed under "young girls doing gymnastics" in the library's card catalog. If they were, I imagine your family and many members of the public would be creeped out or upset.
> but what else do you do here?
Require (or maybe automatically force, if implemented well) those videos to be set to Visibility: Unlisted. I don't think anyone would be upset with sharing these links with family.
After working in tech for long enough, you realize that no one reads anything, even if it's absolutely clear in the UI. And they will blame you for it.
I always have a hard time explaining to people carefully writing pagewalls that throwing out 90% of it would actually be more effective...
In the end it boils down to the truth that a Corporation is a hive mind, an intelligent machine ruthlessly optimizing for stock price and profits, sometimes at the expense of human values.
What's not immediately clear is that the video (and channel) now shows up in search results. You won't be aware of that unless you either have a clear mental model of the platform, or try to search for the video title yourself (which implies a certain sort of mental model to be present already).
Sure, people only skim and headline-read articles, but that's because most articles are shit. That doesn't mean people don't ponder at least a bit when faced with a YT upload page. (And upload through the mobile app is probably more common for non-pro [first-time, casual, accidental(!?)] youtubers; there the privacy: public field is pretty visible, legible and explicit.)
These labels can mean anything in people's minds; "public" becomes "yeah, anyone I send the link to can see it". But it's not exactly YT's fault that people don't make the connection and somehow live in a fantasy world.
And I cannot blame them too much. People are constantly bombarded with misleading messages (e.g. ads), clickbait stuff, terrible UIs, indecipherable contracts, countless pop-up notifications and warnings, opt-outs and other "dark patterns".
(disclaimer: personally, I read contracts and agreements and it's often pointless and frustrating)
Due to various cognitive biases most people fail to realize just how big and far reaching that public is.
To share an example from my own experience a few years ago: I wrote a blog post. I love writing blog posts and sharing them with The Public; that's why I write them.
In one of them I used an XKCD comic. My entire audience, anyone I could ever imagine reading it, knows and loves XKCD.
But then that blog got 100,000 views. Then 200,000. Then it spiraled. Suddenly I was flamed from all sides, called a disgusting human being and all sorts of things. Why? Because I didn't correctly attribute that XKCD comic.
"Yay my friends and their friends and like 1000 people saw my thing" is a completely different "public" than when THE public sees your thing. Most people imagine the former when they think "Oh I'm uploading this and it's going to be public".
I uploaded my children's dance recital video, as an unlisted video, and earned myself a copyright strike because of the background music.
1) Purpose: to illustrate the efforts of my young children in a non-professional context, not to showcase the audio portion of the recording.
2) Nature: audio + video, with explicit permission to record from the show executive/directors (it was a rehearsal, professionals recorded the actual show) and focus on the visual and physical nature of the unique performance
3) amount: abridged songs, amateurishly cut and remixed
4) effect on potential market: There is no way this is a substitute for any of the original copyrighted audio, and I communicate this intent to youtube by requesting it remain unlisted.
It probably does pass the four factor fair use test, but fighting with the youtube contentid machine is futile, and their system doesn't care about fair use.
- the purpose and character of the use, including whether such use is of a commercial nature or is for nonprofit educational purposes;
Which is non-commercial.
- the effect of the use upon the potential market for or value of the copyrighted work.
Which is none.
There's also a question of whether a video containing a bad transfer of a piece of music along with children dancing is "transformative." See Lenz v. Universal Music Corp., where the Ninth Circuit held that Universal Music had to consider fair use before taking down a video of a dancing baby (SCOTUS declined to hear the appeal).
Your whole argument seems to be based on some fictional contractual agreement between the dance studio and the music's owners. That isn't relevant to the discussion at all; a parent recorded and distributed the video, not the dance studio itself for commercial purposes.
Which seems very close to crosslinking comments with timestamps on fully public videos, with advertising.
But, I mean, this takes away the use case of Youtube as a replacement for America's Funniest Home Videos.
Not that that's a necessary thing for society, but it is something to think about. The chances that some creep found AFV stimulating are non-zero, and advertisers paid that show to have their ads. The only difference here is the openness of the comments, whereas those AFV creeps probably had to communicate in secret.
The whole thing is creepy and awful, but I'm not sure what the best solution is. Perhaps banning comments on those videos.
Not attempt to monetize your family videos. Disable ads on your own videos.
The videos themselves aren't really the issue. It's the context they're presented in.
It doesn't solve the "pedos are looking at my kid" side of things.
It's possible to have family videos private on YouTube and only share them with a small circle of people.
That’s categorically false. Every iCloud user has their own photo library and you cannot create a collaborative one. So all my photos are in mine, all my wife’s are in hers, and we’ve resorted to uploading to a joint (also paid) Google Photos account for sharing with each other and family.
There's a quality downgrade so it's not suitable for sharing archive-quality photos, but for us it's adequately replaced Facebook for sharing photos and videos of our kid, vacations etc
Yes, the videos are kids being themselves, and typically uploaded by the children. However, then these predators show up pretending to be other children, and they talk these kids into doing more explicit and inappropriate things. They ask children to film a video sucking on a lollipop, they ask them to make a video showing how to do the splits, or they ask them to make a video playing twister with their sister. I even saw them asking children to do the "toothpaste challenge", where they try to get them to fill their mouth with toothpaste so it looks like cum. If I start typing "toothpaste challenge" YouTube autocompletes "toothpaste challenge drool", "toothpaste challenge tongue", "toothpaste challenge little girl", "toothpaste challenge girls".
These kids don't know any better. They make 50 videos that all have 50 views, then they make a yoga video and it gets recommended over and over again to predators, and they end up with one million views. Then the above situation unfolds where they're requested to make more inappropriate content, or the children realize their gymnastics videos are by far the most popular, so they make more, and more. They think they're making tutorials for other children, but the other side of the camera is just thousands of predators watching and pushing them to go further. Then they start trying to trade contact information with the children to get them off the site.
I don't know how you fix the problem, but it is a problem. We're talking about videos with millions of views here. It's not one or two people trying to exploit these children, it's literally hundreds of thousands, and I think the kids need some help protecting themselves.
So instead of banning the videos, restricting user freedom to prevent them, and deploying automation to recognize them, why don't we ban the predatory behavior, and set system rules to prevent it and/or direct AI at identifying it?
As I said, some of these children will make 50 videos with no viewers, but then they create a gymnastics or yoga video that goes viral with predators.
Even if the comments are disabled, these children realize what subject matter grows their subscribers and view count the fastest, so they keep producing more of those videos. Then another kid sees their channel, and notices their yoga video got one million views, so now they try making yoga videos to replicate it. To their surprise, they also gain traction, and the cycle continues.
So, even without a single comment, you still end up with thousands of children making tutorials on how to do the splits, while they cannot begin to comprehend what is taking place and how they're being exploited.
I'm pretty sure the phrase I used was “predatory behavior”, not “comments”, so I'm not sure what your point is.
Strange, because the post pointing out the limitations of blocking content identified other visible manifestations of predatory behavior.
Admittedly, they may be ones that are difficult to distinguish in a single act from acceptable behavior, but there is no reason an automatic detection system needs to consider each action independently in isolation.
edit: holy shit, i just watched the video report. YT is in serious trouble.
Children have the right to privacy, and it is crazy how many parents do not think twice before sharing.
That being said, it's hard to really regulate this. When someone has a baby, they usually share a photo in a public manner of the entire family. It's hard to argue that parents shouldn't be able to share any pictures of their family in a public manner. It also would then prevent journalists or anyone for that matter, from taking pictures in public spaces that contained any minors... which starts to get absurd.
Hopefully my kids will appreciate that we never posted pictures of them publicly on social media (only shared privately within immediate family). At the very least, it's one less thing they'll have to worry about as they grow up.
The trick is to live like it's 1980. We have many, many pictures of our child, but only 10-15 on social media (of things like winning sporting events or the like). Everything else is in digital form on drives in our home server or printed in physical frames. If/when the child decides they want to do something with their photos, there they are.
Also handy to have on the home server because we can turn any screen into a digital frame on our network. That will be great when they bring their first boy/girlfriend home.
People just feel the impulsive need to share literally everything on social media. Everything is a competition for likes and attention. It's disgusting.
I'm also not talking about someone sharing a photo publicly with someone else's kid in the background.
I just mean as a parent you should need consent to share a video of your kids to a public social media channel. If your kid cannot give consent because they are underage or refuses, then you shouldn't be able to share it.
I'm afraid not a lot of parents/people would be willing to pay for such a service, so I'm unsure this is going to happen any time soon.
In practice, no politician will want to be on record as the cause of child-safety ratchets being relaxed, so there is no realistic path from the current environment to one in which a person, on reaching majority, becomes free to act as they see fit with previously problematic images of themselves. The trend is towards such images being seen as problematic purely based on content, regardless of context.
Point is there should be a way to limit public sharing of photos and videos without a child's consent. There should be a good reason, beyond a parent's own narcissism.
There is an application in CA for an entertainment work permit, but that's mostly validating age and satisfactory education status (for school age children) and medical clearance (for infants under 1). So, I think the analogy doesn't work and you should work out what kind of application you mean here.
Given everything else YouTube already does, I seriously, no sarcasm, don't see this as a particularly problematic solution. As ceejayoz pointed out, there's some thorny consent issues involved here: https://news.ycombinator.com/item?id=19209766 (ceejayoz' point isn't much larger than my summary of it, but I wanted to give credit. I think it's a good point.)
Sometimes scale brings things like this. I have to admit giving a smaller, specialty service a niche to exist in doesn't exactly bother me, either.
It was instantly and automatically flagged, removed, and a strike placed against me for going against community guidelines. I had 40 characters to appeal, but the process is automated, since the response was instantaneous. There are other videos of kids dumping water on each other ... even when both are in the tub. An email to YouTube support asking what guideline I violated has gone unanswered.
To me, if Youtube is automatically, instantly flagging my video, there is no reason they cannot automatically flag and remove the videos in Matt Watson's video, especially if they are public and are attracting a wide range of comments.
Don't get me wrong, it's Youtube's pool, so their rules. They can choose the content they want.
But then you think they can apply the algorithm more broadly? Wouldn't that just risk many more false positives and irate users?
No. As I said, their rules. I have no idea if it is a false positive since I cannot get an answer as to what guideline was violated. I could have violated one of their guidelines.
Youtube doesn't owe me anything ... I use their service for free with unlisted videos to share with family. They make no money off the videos.
My point was if they can flag something like my video, I am sure they can flag the ones given as examples attracting the other people.
I can understand your point that false positives would generate more upset content creators who do make their videos public and would like ads run against them.
The first is a mere process of comparison, while the latter requires quite a few levels of subjective abstraction.
I feel you. Many of my videos have been flagged for copyright violations due to background music, but since they are unlisted and only for sharing among family, I'm ok with it.
There have been a couple of times they have muted a whole video for a snippet of music, but I've never had a video removed for music.
If you posted them publicly so Google could properly monetize them, they'd be "Featured videos of the month!" in a pedo group.
Isn't this punishing the users more than punishing YouTube? I get how these kind of videos would be banned in a country like Saudi Arabia, but in the USA, or much of Europe? Yes, CP should be disallowed, but kids playing in a pool adequately covered?
As I sort of alluded to already, giving other services an incentive to exist, or incentivizing more self-hosting, I see as a virtue anyhow. YouTube shouldn't be The Video Site On The Internet.
Other services may or may not fill the huge gap created, but what’s to say they won’t be subject to the same regulations as YouTube? Anyway, the politicians would probably take a huge fall if they did this; voters don’t often buy “it’s for your own good” arguments.
I don't think there's an ideal solution. But they'd need some mix of detecting videos that would be prone to these kinds of comments, then subjecting the comments on those videos to higher scrutiny, such as stricter automatic detection/deletion (EDIT: shadowbanning or hiding the comment as a "first pass" seems good) or even manual review.
Manually checking everything is impossible without severely hampering the freedom and capabilities of the platform, and automatically detecting everything is not possible without accepting a heck of a lot of false positives.
But putting more effort into a mix of the two would be better than what they appear to have been doing so far.
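A minimal sketch of that mix. Everything here (the thresholds, the labels, the idea of a single score) is made up for illustration, not anything YouTube actually does: an automatic scorer auto-hides the clearest cases and queues the ambiguous middle band for a human, so reviewers only see the comments automation can't decide.

```python
# Hypothetical two-tier triage: auto-act on high-confidence cases,
# send the ambiguous middle band to manual review. Thresholds are
# invented for illustration.
AUTO_HIDE = 0.9     # score above this: hide without waiting for a human
NEEDS_REVIEW = 0.5  # score above this: queue for a moderator

def triage(comment_score: float) -> str:
    """Route a comment based on an abuse-classifier score in [0, 1]."""
    if comment_score >= AUTO_HIDE:
        return "hidden"        # shadowban/hide as a "first pass"
    if comment_score >= NEEDS_REVIEW:
        return "review_queue"  # a human moderator decides
    return "published"

# Usage: triage(0.95) -> "hidden", triage(0.6) -> "review_queue",
# triage(0.1) -> "published"
```

The point of the middle band is exactly the trade-off described above: it caps the false-positive damage of automation while keeping the manual workload to a fraction of all comments.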
This is very significant. It becomes 1000X more sinister here - it goes from "innocently upload family videos to share" into "steal family videos that are popular with perverts, post them yourself and monetize them".
It will be very difficult for YouTube to separate legitimately uploaded family vids that attract perverts from scum who use stolen "innocent" content to pander to perverts.
If one channel has a different set of kids every video, maybe not okay.
Preventing reuploads of videos that aren't explicitly flagged as eligible for reuse ought to be a straightforward application of the tools they already have for copyright enforcement, so that aspect of the problem should be easy to solve.
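To illustrate why this is considered tractable, here is a toy sketch of the fingerprinting idea behind Content ID-style matching. It's a pure-Python difference hash over a single made-up grayscale "frame"; real systems fingerprint audio and many frames per video, so treat this only as the principle, not a workable matcher:

```python
# Toy sketch of fingerprint matching for reupload detection.
# A difference hash ("dhash") records, for each pair of horizontally
# adjacent pixels, whether brightness increases left-to-right. Uniform
# edits (recompression, brightness shifts) barely change those gradients,
# so a reupload hashes close to the original.

def dhash(frame):
    """frame: 8 rows of 9 grayscale values (0-255) -> 64-bit fingerprint."""
    bits = 0
    for row in frame:
        for left, right in zip(row, row[1:]):
            bits = (bits << 1) | (1 if right > left else 0)
    return bits

def hamming(a, b):
    """Number of differing bits between two fingerprints."""
    return bin(a ^ b).count("1")

# Synthetic "frame", and a slightly brightened "reupload" of it.
original = [[(x * 30 + y * 7) % 256 for x in range(9)] for y in range(8)]
reupload = [[min(255, v + 10) for v in row] for row in original]
# hamming(dhash(original), dhash(reupload)) stays tiny, so a small
# distance threshold catches the reupload.
```

A real pipeline would hash many sampled frames plus the audio track and match against an index, but the core "near-duplicate detection is cheap" claim holds.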
Everyone is part of the problem, starting from the parents who push their kids into the public spotlight, to the local media that provides coverage of this stuff, encouraging other fame-seeking parents to try to get in on it.
For your example ("the same kind of videos my own sisters would shoot with our dad's camcorder") there are two remedies:
1) Upload the video as unlisted, so it will not appear in recommendation systems and you can send the link to your friends/family.
2) Disable comments on the video.
Public content is a bit more dangerous to leave unchecked in 2019 than in 2009.
The suggestion here is that Youtube should do something about it, and I'm asking what that solution specifically looks like.
This may be an awkward position to take in public, but I'm not comfortable with the idea of criminalizing "creepy comments", especially on the internet.
What sort of advances have happened since 2009 that would explain this?
To me, the seminal event marking the transition has to be IMDB shutting its forums. Public discussion and discourse is now seen more as a liability than something of value.
What about the rights of the children? Maybe they should have the right to wait and decide, upon adulthood, whether they want to share these videos rather than have their parents blithely allow weirdos to leer at and comment on them.
Pretty sure none of the videos I have posted have a billion views.
And a creep in real life can easily follow you home.
It wouldn't surprise me though - perhaps sexual variants just tend to happen in that rate window... chemistry doesn't care whether the particular variant is ethical or not.
I suspect "people spectating at gymnastics meets and pools" is not a random sample. Some will be relatives or friends of the participants, but who are those others?
They could also detect links to compromising timestamps in the comments, take them down, and shut down the accounts that post them. It's not particularly challenging.
(I used to work as an engineer on YouTube's abuse team many years ago.)
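Detecting the timestamp links themselves really is cheap. A hypothetical sketch (the regex and function names are my own illustration, not YouTube's actual tooling) that flags any comment deep-linking to a specific moment in a video:

```python
import re

# Matches in-comment timestamps like "2:33" or "1:02:45", and URL-style
# deep links like "?t=153" or "&t=2m33s". A flagged comment would then go
# to the stricter review path discussed elsewhere in the thread.
TIMESTAMP = re.compile(
    r"(\b\d{1,2}:\d{2}(?::\d{2})?\b"       # 2:33 or 1:02:45
    r"|[?&]t=\d+[hms]?(?:\d+[ms]?){0,2})"  # ?t=153, &t=2m33s
)

def links_to_moment(comment: str) -> bool:
    """True if the comment points at a specific moment in the video."""
    return bool(TIMESTAMP.search(comment))

# Usage:
# links_to_moment("@2:33 is what we all came here for")  -> True
# links_to_moment("great video, well done")              -> False
```

Of course the hard part isn't the regex, it's deciding which videos' timestamp comments deserve scrutiny and what to do with the accounts posting them, but the mechanical detection is trivially automatable.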
That's a really cool idea.
My personal opinion is that it's the other guy's problem if they want to sexualise images of my kids; if I have to self-censor, we've already lost. But as my partner points out, principles don't count for much if you've got pedos trading pictures of your kids, so.....?
I see people posting photos and videos of their children in public Instagram accounts all the time, and I think that's a terrible thing to do. Not because creeps might get off on them (though of course that's bad), but because it shows that the parents -- who have been entrusted to make decisions on behalf of their minor children -- are showing a distinct lack of respect and care for their children's privacy.
I don't have kids, but if I eventually do have any, I will never post photos or videos of them on IG, FB, YT, etc. I might privately share some things via better-protected, private-by-default channels (email, messaging apps, possibly private sharing on something like Google Photos), but that's it.
This isn't about self censorship; this is about understanding what these public services are about, and making responsible decisions about what I do and don't post.
That's actually creepy.
I don't think we can stop pedos leering at kids online. In person you can tell them to get lost or call the cops.
The worst part is that I disagree with his claim, or rather I don't think he makes a strong argument that these videos are _intentionally_ sexual. It looks to me like a kid doing kid stuff. I might have made the same videos at her age _but_ it is still sexualized by some people.
I am not sure what a good solution to that could be. Going by these news articles, it looks like any video showing a kid is potentially sexualized.
Banning all kids from youtube looks a bit extreme and would raise its own set of problems. And at this point, just having a kid in any movie/series has the same consequences.
So yeah .. not sure we can stop that. And if you are sharing timestamps of kids doing squats, you are probably too far gone for a suggestion to see a psychiatrist to be enough.
The only thing I see wrong here is the sexualization, but it'd be us doing it if we assume the reason for the timestamp. Compare: a video of people playing soccer, someone adds a timestamp to a part where a guy passes the ball to another guy. Would you think the timestamp was created because of sexual arousal and say they need to see a psychiatrist? How is a kid doing squats different?
These videos exist, people like to watch them, but I don't see anyone assuming the intended context is for people with some cat-with-yarnball fetish to share them with others who have it. Nor were the children eating the lollipops intending anything other than having fun. There's nothing wrong with either, other than people assuming why people upload videos, watch them, timestamp them, or share them.
In reality, I could have in my mind someone that gets aroused by... maracas (I don't know, it's the most mundane thing I could think of). There's a maracas video, and there's a comment with a timestamp that says "this is why you're here", so in my mind, there's this group of people that get aroused by maracas and this comment is for them. The intent and context is clear. All this fabrication is no less real than yours.
Chances are there's a guy out there that actually matches the pedophile in your mind; however, they're not wasting their time clicking timestamps on youtube. If you research porn, people adapt and need worse and worse things to "get off". The actual pedophile stopped being impressed by such timestamped content a long time ago.
But imagine: store employees make a peephole in dressing rooms. Sure, parents help their children change, depending on age; it’s all innocent. But an unsupervised, uninvited stranger is a little different, right?
Same with these kids’ videos. Now of course a good question is who’s uploading, the kids or the parents? Parents should know better than to make family vids available to all, and Google should do a better job of figuring out if a kid’s old enough to upload if the parent isn’t involved.
They may even default to “private”/unlisted if they can auto-determine the uploader is a kid, and advise parents if content, though innocent, contains minors (this should be the default for parents anyway, because although they have the right to make decisions for their kids, they should think hard before putting their kids out there on the internet, if nothing else for the privacy of their kids).
I don't have an answer, but considering the size of Disney's and Nestlé's combined marketing budgets, youtube engineers will be hard pressed to do something.
This isn't a new problem, as their comment subsystem has been a usability nightmare for years. When a comment thread gets to a certain length, following the conversation and pinging specific replies becomes increasingly difficult. The late John "Totalbiscuit" Bain had to turn off comments on all his videos and direct all discussions to Reddit, because once the comments start to get rowdy it's impossible to moderate.
Anonymous comments are always going to go down that path. The way Facebook slowly rolled out and tried to keep all accounts real has contributed to the way its network works. We see other sites trying to do this, like OKCupid, which tried to force people to use their real first names.
I personally think there is something that's lost when you lose that anonymity. There is a realness people let themselves show, and there's something valuable that can be learned about society which is overshadowed by just calling comments a cesspool.
Some people think youtube comments are complete trash. They usually watch videos of very large channels. Other people think youtube comments are dull: not particularly horrible, but rarely insightful. It's common to hear this about channels with a million subscribers or less. But below 100,000 subscribers, in the long tail of youtube, you'll often find that content creators enjoy having reasonably constructive conversations with their viewers in the comments.
There are always exceptions, but these are the general trends I've noticed.
Examples of the first category, the very large channels, are easy to find. Pewdiepie, and anybody like him, have awful comment sections and I think everybody knows it. I subscribe to several very small channels with only hundreds or thousands of subscribers that have great comment sections, but you'll have to excuse me for not sharing them here. In the middle range, I can give you a few examples. Matthias Wandel (woodworking) and Forgotten Weapons (history and overview of rare/historical firearms) both recently passed 1 million. Their comment sections are rarely insightful, but typically not offensively bad. Mostly just people making the same obvious joke over and over again, or people trying and failing to be insightful.
There is a big chasm between the censored internet (Facebook, newspapers), where everything is polished but only reflects 25% of the population, and the free internet (the rest), where you get an actual image of diversity. It’s just that you don’t like this diversity and you want them not to exist, but their points are, if not correct, at least vastly unaddressed because visible society despises them.
I defend men’s rights. All of the MR forums are underground. They’re not happy; they’re unaddressed and despised by society. And that’s 49% of the population. Saying men’s rights are a non-issue which should not be addressed and deserves the censorship it gets on the above-ground internet is, by nature, discriminatory towards men. It’s just an example, but you won’t be able to keep the pressure cooker sealed very long.
>It’s just that you don’t agree with it.
Correct, I don't agree that it's ok to refer to groups of people as "jobless apes," and I despise the mindset required to dehumanize like that. I find it disgusting. I also find it irrational and dangerous, and I want it to go away. Is this a "just"? I am happy to dismantle any "advantage" of a racist mindset, if you'd like. It simply does not need to exist; the existence of racism is a failure of society. Every member of society, I believe, has a responsibility to make it Go Away. That includes youtube.
> but only reflects 25% of the population
I am surprised to hear you suppose that 75% of the population is, for example, racist. I strongly reject this hypothesis but welcome your evidence.
> an actual image of diversity. It’s just that you don’t like this diversity and you want them not to exist, but their points are, if not correct, at least vastly unadressed because the visible society despises them.
"just," again, implies that my dislike of this "diversity" and my desire for it not to exist mean it should exist? Racist points should remain unaddressed - they are wrong. They deserve no more attention than someone saying "there is a dragon in my garage, you may not see it but you must worship it with me." Riddled with fallacy and taking no energy to generate, these ideas and concepts must be rejected with as little energy as it took to generate them, or we open ourselves up to "death by bullshit" - a DDoS of bullshit.
Society should despise many of these concepts. Racism is stupid, pointless, a waste of energy, and immoral. That someone who is racist gets despised by association is a shame; I wish we could all be Dale Carnegie. Punish the sin, not the sinner.
>And that’s 49% of population
I reject the notion that all men are Men's Rights activists; that idea isn't even kosher by Red Pill values, because "cucks" exist in their value system, and they obviously aren't men's rights activists. Furthermore, my own experience disagrees with the supposition: I have met thousands of men, and about two men's rights activists. On the internet, I only find them when I hang out on 4chan or the boards made for them on reddit. 49% of the population is "underground"?
The phrase "men's rights activism" does not draw meaning from the words alone; the definition encompasses more. I posit that censoring Red Pillers is not equivalent to discriminatory action towards men in general.
Once the number of comments reach a tipping point it's the time to bail out.
EDIT: some of the worst I've seen usually involve Mickey or Minnie Mouse doing drugs and other things, while containing a bunch of keywords intended to get into recommendations.
There is no need for YouTube to have any video of my children a pervert might want to watch.
> Ban kids uploading a video of their innocent pool party because some creeps might enjoy it?

What about topless babies?
The children are unable to agree to the terms and conditions set forth by YouTube, and sadly, there is a possibility that their parent or guardian is aware of the exploitative nature of the community.
So, perhaps yes?
> The children are unable to agree to the terms and conditions set forth by YouTube
Which means that children should not be able to upload any kind of video to youtube, but it should be fine if they upload it to say an ftp server for public domain videos that does not impose any terms and conditions, right?
That'd likely be seen as an overreach now, but twenty years from now we may wish we'd done that.
In the meantime, making them visible only to an approved list of friends/family seems like a reasonable step for videos of, or published by, minors. Maybe you put together a "child actor" exemption where the parents can moderate them for public access.
Comments like, from "predator133," "@2:33 is what we all came here for," linking to a part of a video where a child is doing the splits or something.
It sounds more like something someone who wasn't a pedophile would post, because they were trying to get the video taken down or to bait real pedophiles into posting agreement.
There are neo-Nazi parades of people not afraid of being hated, but there are not, like, "I cheat on my taxes" parades. Too much to lose.
And there are a lot of hoaxters, trolls, parodists, and false flaggers out there. Often more of them than the deviants, because the deviants are weird and underground.
This is a great test to see if something is dirty or if it's only dirty in your mind. If you have to create an imaginary pedophile in your mind that gets aroused by @2:33, and a whole group of them that are searching Youtube comments to find the timestamp for "what they're looking for", you have to wonder if you're not using some huge leap in logic to arrive at your conclusions.
Ironically, what this video has shown is that Youtube's system works: this guy, to make his video, had to put in what seems to be hours of research, and after all that, he shows the most shocking videos of little girls he was able to find. And there wasn't any nudity, or anything even implicitly sexual, shown. Everything was clean and every timestamp led to family-friendly content.
The "worst" thing I saw was the usage of the water emoji, and I'm like, really? People use the water emoji and suddenly you can be very detailed about their sexual orientation and why they timestamp videos or share them? Everything happening in these completely innocent videos could be explained in many ways; thinking the most likely explanation is "that's a pedophile" means the imaginary pedophile in your mind needs to be toned down.
Why do you believe the innocent angle is more likely? Do you know people that make a habit of linking timestamps of children doing yoga poses?
There are a ton of weirdo creeps out there and putting your child's private videos on Youtube of all places seems like terrible parenting.
Like most people here likely do, I run tech support for my entire extended family, and a very very common theme is trying to share photos/video with family and friends.
Apple locks down iCloud so it's only easy to share with other iDevice users, and YouTube makes it REALLY easy to initially upload a video (drag onto this box, or just share from your phone/tablet video browser) but then doesn't clearly explain the different options in the processing screen, etc etc.
They also don't understand the repercussions. They're not on reddit reading about Elsagate, they're not aware of the deep seedy underbelly of the Interwebs outside of "people buy drugs on the darknet" because they saw it on the 7:30 Report. They think the Internet is Facebook, Google, YouTube, and Amazon.
It's extremely easy for technically proficient people to underestimate how complicated and overwhelming these "simple" (for us) tasks can be to non-techies.
I know people in my family that didn't even know YouTube had comments. They watch YouTube on a smart TV and just use the built in search tool, and all videos play fullscreen. This doesn't fit with the above family sharing thing but it shows how little lots of people understand YouTube in general.
Youtube simply needs to move to a proper gatekeeping model around monetisation, requiring human review and biannual review checks. Start it at 10,000 subscribers to avoid being overwhelmed. Prompt once-off approval for 'viral' trending videos from new channels (even if the advertising money goes completely to Youtube).
The internet is maturing and existing gatekeeping models are too lax. Same thing with games allowed onto Steam. Now any idiot with a phone can upload something - previously you needed a computer and decent knowledge to do that.
Limiting new uploads from new accounts to 720p30 max until they hit 1,000 subs or pay a $100 'starter fee' will save on storage and processing fees.
I don't understand why Youtube (and other tech platforms) sets the barriers to entry so low, then inundate themselves with work. Simply raise the barriers until your human-approval processes can handle the volume.
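To make the proposal concrete, here's a minimal sketch of the thresholds described above. All of the numbers (10,000 subscribers plus human review for monetisation, 1,000 subscribers or a $100 starter fee to unlock uploads above 720p30) come from this comment, not from any actual YouTube policy, and the names are made up for illustration:

```python
# Hypothetical sketch of the gatekeeping thresholds proposed above.
# None of this reflects real YouTube policy; the numbers and names
# are taken from (or invented for) this comment.

from dataclasses import dataclass


@dataclass
class Channel:
    subscribers: int
    paid_starter_fee: bool = False   # the proposed $100 'starter fee'
    human_reviewed: bool = False     # passed a human review check


def monetisation_eligible(ch: Channel) -> bool:
    # Monetisation would require both the subscriber threshold and a
    # passed human review (the biannual re-review isn't modelled here).
    return ch.subscribers >= 10_000 and ch.human_reviewed


def max_upload_resolution(ch: Channel) -> str:
    # New channels are capped at 720p30 until they grow or pay the fee.
    if ch.subscribers >= 1_000 or ch.paid_starter_fee:
        return "source"
    return "720p30"
```

The point of writing it out is just to show how little machinery the proposal actually needs: two threshold checks, with human review as the only expensive step, applied only to the small fraction of channels that clear the subscriber bar.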
Youtube should have wised up and become a lot more selective about which videos were supplied with ads.
And if running slightly fewer ads is too commercially difficult, then save costs by limiting new uploads to 720p.
Have you seen this -- https://youtu.be/M78rlxEMBxk?t=364 ?
Instead, everyone (like every media outlet and internet comment) casts a vote against "child exploitation". Who wouldn't? And the discussion stops there. "Whatever they're doing, it needs to stop!"
For example, I bet most people aren't thinking "whispering girl" or "family video of daughters playing in the pool" when they hear the phrase "child exploitation". And maybe most people think Youtube is no place for those videos, fair enough. But the taboo subject and the lack of specifics creates an incredibly charged and blind environment that makes it hard to have a real discussion.
So let's be specific here: what kind of action should be taken against this whispering girl's video and what kind of policy does Youtube need to have at scale?
Is that video sexual content? And how would you suggest encoding it in a policy change? Obviously, "I'll know it when I see it" doesn't scale.
That the solution must scale algorithmically is purely for the convenience of the site, and for matters of law and taste nothing but an excuse.
Even having different countries with different laws and standards doesn't scale. So what?
Then we inject judges and juries into the process to make them even more subjective.
Every country has some limits, and standards for movie ratings and age restrictions etc.
If some legislation cannot be so rewritten, it should be thrown out. Yes, that goes for anything with "vulgar" or "indecent" in the title.
Speaking of standards for movie ratings - they aren't law in US, just industry guidelines. So they're subjective, but people have a choice of following them (e.g. because they believe the board that rates stuff has the same subjective opinion of what is "inappropriate" as they do) or not. Similarly, movie makers can, and do, release unrated movies. Since it's all voluntary, there's no problem here.
But what I can tell is that the video I linked above... I'll easily give it a 99% probability that it was made on purpose, with a sexual context in mind, most probably by some adult and not another kid, and that it is not "just a kid having fun", and it's not a "family video".
This is softcore erotica with a 12-14 y.o. girl, and it was made to be softcore erotica. There is literally no other reason to shoot a video like this.
is also true.
Without sex or nudity, how do you propose you'd make it more similar?
Establishing a criteria for pornography is hard, but the usual metric seems relevant: designed to sexually arouse and otherwise devoid of value. But roleplay ASMR content presumably isn't designed to sexually arouse, and has other value for those who experience a physiological ASMR response from it.
How about explicit language?
>and getting so close cannot be accidental
Probably because sexual arousal and ASMR are both physiological responses.
Can you point to another piece of media, say something on the Disney Channel, where adding a few curse words would turn that production from innocent children's entertainment into soft core pornography? If not why is that a unique feature to this particular video?
Let me put it this way: if I were an evil person and my goal was to create a legal facsimile of child pornography, I don't know that I could do better than this video did at that stated goal. Perhaps legally it doesn't cross a line, but can you honestly say that you'd leave your daughter in the care of the person who made this video to do the same?
> I don't know that I could do better than this video did at that stated goal.
What is this supposed to establish? That creeps can get off on this video? Creeps get off on mundane things all the time. It establishes nothing.
>Perhaps legally it doesn't cross a line but can you honestly say that you'd leave your daughter in the care of the person who made this video to do the same?
Since the artists in the video are generally those creating it, sure.
Presumably you mean would I leave my daughter in the care of someone who enjoyed such videos. If I knew them, sure. If I didn't know them, it wouldn't really add much information. It would slightly alter my prior against them, but that's not saying much. In the space of non-pornographic things that creeps would get off on, this video surely ranks higher than average. But this isn't an indictment of the content.
I'm sorry, if you think that's "kids having fun" you belong on a cross.
Edit (more details on exactly why this is wrong): It's shot in the style of POV porn videos that are popular today, the driver is male (why not female, if the purpose is not sexual?), his name is used as infrequently as possible (don't want to ruin the immersion of the viewers), she holds up handcuffs and suggests that she "might have to use them on you", the outfit either doesn't fit or is purposefully open in her chest area, and she is wearing exaggerated makeup and press-on nails in a manner fitting of a central-casting prostitute.
It's about as extreme as you can get without the FBI coming knocking on your door. I'm an atheist and I'm praying that someone, somewhere in law enforcement is keeping a close eye on her. Poor girl.
I realize some of these kids make tons from ads on these videos .. but at some point ... that's just bad parenting. Like what are you teaching your child?
If you're under 16, your content, and the uploading of it, should be supervised. Why are we letting our children run free on the Internet? You wouldn't let them run free on a highway and the Internet is just as dangerous but in different ways.
I believe YouTube needs to implement a system of age verification for content creators. 16+ minimum if you ask me. If the parents do the uploading, welp, you found your issue (if any arises).
Kids here roam across the city unsupervised by age 7-8. This includes highways. Also, correct me if I'm wrong, but doesn't YouTube require you to be at least 13?
You need to be over 13 (or above, depending on the country) to create an account. There's no limit on the content itself.
In my opinion, there is no justifiable use case in which videos of children under 13 should be made publicly-available, but YouTube itself disagrees. Case in point: Ryan ToysReview has 18 million subscribers. YouTube thinks it's okay because the videos are uploaded and edited by his parents, while I think it's absurd that anyone makes an income based on sharing videos of their children publicly.
Movies and TV shows starring children are not unheard of as commercial endeavors. With YouTube being a viable monetizable video platform, I'm not sure why parent-driven productions on that platform are fundamentally any worse of an idea.
There might be a time in which we reject such content just as we did with prior widely-accepted behavior that was immoral (like homophobia and racism). Future generations might look down on us because of child celebrities (among other things), and I sincerely hope they do.
You’ve presented no argument as to why it's not alright. Now, this may be a base moral axiom for you, such that no argument is needed, but that's not particularly convincing to anyone who doesn't already take it as a moral axiom.
> There might be a time in which we reject such content just as we did with prior widely-accepted behavior that was immoral (like homophobia and racism).
There might be a time when we see anything in any way you can imagine. That's not an argument that we should.
There might be a time
I also notice that video has 10 times the number of views as the rest of the videos of her, presumably because of the title.
I had a couple videos from when she was younger and wearing diapers and a shirt when she was wading in the bay, and got a comment from someone who wanted to know what brand of diapers she was wearing. (wtf?) I deleted the comment of course, but a while later I see that YouTube has deleted the video for "inappropriate content" (double wtf). I mean, she was more covered up than if she was wearing a two piece bathing suit.
The thing is, even if someone is a perv and gets off on this sort of thing, it isn't endangering her. I'm not going to lose sleep over it. If I am really that worried about protecting her from pedophiles, I probably shouldn't take her anywhere in public, where someone could follow us home or otherwise directly cause her harm. I'm not going to live my life in that kind of fear.
Why do you need to post publicly rather than marking it as unlisted and sharing the link with people you know and trust? Would you personally like to have a video posted of you in your underwear? Even if that's okay with you, perhaps your daughter will grow up preferring to have a more private life, and to keep things like that off the internet? But you've taken that choice away from her.
One of my favorite articles on this topic: https://slate.com/technology/2013/09/facebook-privacy-and-ki... (and a follow-up, after the author was informed that she had unwittingly allowed some photos of her daughter out onto the internet: https://slate.com/technology/2013/09/privacy-facebook-kids-d...). I don't know that I'd go as far as she has, creating a sort of "digital account trust", but I do like the approach of keeping a kid's private life off the internet until they're old enough to decide how much they want to share.
In what sense "child predators" are an issue? Because they leave disgusting comments?
> perhaps your daughter will grow up preferring to have a more private life
... or perhaps she will grow up preferring to get as much publicity as possible. But if her parents do not publish that video on YouTube - she would lose that opportunity.
In any case, this publishing decision is in hands of her parents, and clearly, parents made up their mind (and decided to publish that video).
I don't "need" to post publicly but I also have no reason not to. I'm not going to let people having creepy thoughts control my life, unless there is an actual, tangible danger of harm.
Honestly, if I was worried about creeps, I wouldn't let her wear a skirt in public. I see other girls at the playground all the time that make me wince a bit...I'm far more likely to dress my daughter in pants. But at the end of the day, I'm 1000 times more concerned about people in real life than on YouTube....because they are right there.
I don't exactly care myself but I do appreciate that my kid isn't capable of consenting to having his childhood published. So I don't do it.
It seems really odd that your reason for making that call is, "meh, why not?"
Maybe the sleepy baby girl from https://www.youtube.com/watch?v=KTCQpjUrCe8&index=147&list=L... will grow up to be really shy and feel that wasn't her best angle. That strikes me as a personal hangup, not some constitutional right we should all be protecting on her behalf.
It's funny how heated I can get about the notion "there are people whose reaction _isn't_ 'meh, why not?' What's wrong with them?" I'm like a militant meh-why-notter.
(thanks for your support, btw...)
I think about things she might actually be concerned about, but I see no reason she should be concerned about it, so no I'm not going to worry about it. If I was concerned, it might be about "should I put her in a dress or have her wear pants?" Because if she is wearing a skirt, someone could see her underwear, god forbid. YouTube or not.
At least on YouTube, it's pretty hard to follow someone home like can happen in the real world (e.g. Jayme Closs).
You know what I spend my efforts worrying about? Her getting hit by a car. Drowning. You know, real things.
Privacy is a real thing, and worrying about it isn't mutually exclusive with any other concerns. I'll be honest and say I'm somewhat disturbed by your whataboutism and nonchalant deflection of the fact you're pervasively invading your child's privacy and posting it online for all to see, consent be damned. Yes, you "see no reason" to be concerned, and that's your choice to make as a parent, but at the same time it's an extreme stance you've taken when, as other commentators have pointed out, the safest and most logical thing is to be more conservative with sharing such information.
Sure, but you need to balance things. Having someone on the internet see a video of her, fully dressed, and getting aroused by it, doesn't exactly compare to any number of real issues. As I said elsewhere, if I actually was concerned about her privacy so much, I'd go to a bit more effort to avoid having her wear a skirt or dress when she is playing. That seems like a good first thing to worry about.
Although not many people see the videos I post of her, she was thrilled when a nearby YouTuber (a 7 year old girl who actually has a significant number of subscribers) found her via this video and we'll hopefully get together for skateboarding together. So there are positives about putting them out there.
Meanwhile, I've dealt with my daughter's mother for years, as she freaks out over the most ridiculous tiny irrational fears, while nearly killing our daughter due to her recklessness in other ways.
You may think I have an extreme view, but you know who has to be concerned about being seen as having extreme views on things like this? People who are in contested custody situations, where their ex hires aggressive, unethical lawyers willing to grasp at any straw to tear down their parenting in a court of law. All the more so if that person represents themselves rather than having an attorney.
And that's actually me.
I assure you, if posting YouTube videos of my daughter playing at playground was something they thought they could use against me, they would have. But they didn't. And I won about the strongest decision you're going to find, and was awarded primary custody.
So yeah, I reject your premise that I have extreme views on this issue.
Clarification, though: I was talking about you-the-adult, not some hypothetical video of you from when you were 3. Maybe you would be ok with that. That's fine. You're an adult and can decide for yourself what you want to post online about yourself. (I personally do not want videos of my 37-year-old self in my underwear on the internet, but that's just me.)
Anyway, as I initially said, I'm not even talking about creeps here. I, personally, as an adult, would not want photos/videos of me as a child floating around on the internet. It's not a matter of embarrassment or safety, it's just that I don't want my life to be public to that degree.
But hey, you can always argue that I'm an adult, and I didn't grow up with always-on internet connectivity (or, really, much in the way of internet connectivity at all until I was in high school), so maybe my opinion doesn't count, because culture and social expectations are different now.
But... they're not, really, at least not universally so. There are people who are children right now who are uncomfortable with this. The linked article also mentions, for balance's sake, some kids who enjoy having a ton of information about them online. But that's the thing: you can always put more information about yourself online, but once it's there, it's very hard (sometimes impossible) to take it back. Parents who share things about their kids online rob them of the agency to decide for themselves how private they want to be. That's just fact, regardless of what you've decided.
Not always realistic. If you record some boyscouts event, you probably don't know the name of everyone there, let alone account information. People often email a link, and then people share it by forwarding it.
Alternatively, of course, you could upload it to some website for your organization, but again it'd generally need to be public, or you have the usual IT horror of getting everyone an account and dealing with people who say they can't log in. There is a reason people just toss it on YouTube.
Important point, still kind of weird to me, even in this day and age of Facebook and social media in general, where everyone feels the need to share every intimate detail of their lives with strangers.
The content of your videos (based on the example you gave) are harmless. Adorable, in fact. You have a beautiful daughter - congrats. Sadly you're missing the point that you did the uploading, not her.
The article is about children, I'd say around 11-14 in age on average, possibly late teens too, uploading videos of themselves... by themselves. And some of the videos are adorable videos of them down the park in jeans, but others are videos of them in bikinis in their bedrooms. Some of them are videos of them sucking on lollipops... I mean, really?
Also, the videos being uploaded are flags. They're used to signal to pedophiles: there's other content available outside of this video. They're guiding lights for people who want darker content.
Your videos are just your videos, and you moderate them. It's not the same thing at all.
Creep!!! Just kidding.... Thanks, she's the best. :)
Point taken. I get a bit defensive here and there, and I guess what I'm more defensive about is reflected more here in the comments, where people are so quick to shame parents for not being so worried about things that I think are non-issues. (note that I am a bit of an outlier at Hacker News, in that I am in general not particularly concerned about privacy...in the sense that I don't worry about, say, Amazon following me around the web because they have figured out that I am in the market for digital pianos and skateboard parts)
But... it's a video of her, not of you. Is she going to lose sleep over it if in five years she goes on YouTube and discovers someone has cloned and monetized the video, and a bunch of creeps have commented on it?
Someone could video her at a playground, and post it on YouTube. Should I not take her to the playground out of fear of that?
Honestly, I think I'm probably more concerned than most other parents, since they put their daughters in skirts and let them play on the monkeybars, while I tend to dress her in pants when she is going to be doing that sort of thing. But it's probably more out of concern for other dads, who (thanks to all the hand wringing about this sort of thing) tend to feel like everyone is on the lookout for someone who is creepy.
I mean, I don't care. She's just playing at the playground. I didn't think about it at the time, I guess I could, but I don't see why it matters. If I was concerned about anything, it would be about creepy people seeing her in the real world.
If I was worried about the public seeing stuff, I'd just make things unlisted, and sometimes I do.
Careful with that idea, it's some pretty enablist shit, and you most frequently hear it coming from the perverts themselves.
And no, being lumped in with pedophiles matters. I am trying to warn him who he sounds like, not making a rational argument against his position. It's up to him to decide if he wants to sound like a pedophile.
Given that, if I sound like a pedophile, to me that says more about the listener than the speaker.
It wasn't, and I was awarded primary custody.
All I said was "be careful" and now you're giving me your ex's lawyer's contact info. What a weird escalation!
This can only go badly for you, I hope you realize...
The Internet, in general, is a "risky" place for a child, always has been and always will be.
But trying to "child-proof" this whole thing will just completely destroy what little remains of its original idea.
Doesn't YouTube by now have curated categories with specific themes? Imho it shouldn't be that difficult for them to offer something like that.
And yes, YouTube is full of weird semi-soft-core content, which I consider quite amusing and innocent in its own way. It's kind of cute to think there are so many users on there who seem to be completely incapable of finding actual porn on the Internet, and instead need to use YouTube as the digital equivalent of a Gloria's Secret underwear catalog.
I can see someone preferring to watch two attractive women massage each other on YouTube over the content found on porn sites.
Other than that, honestly, the only population I can come up with who would be watching these on youtube instead of an actual porn site would be kids.