
YouTube Is Facilitating Sexual Exploitation of Children. It's Being Monetized - tomcam
https://m.youtube.com/watch?v=O13G5A5w5P0
======
komali2
Reddit's having a field day with this:
[https://www.reddit.com/r/videos/comments/artkmz/youtube_is_f...](https://www.reddit.com/r/videos/comments/artkmz/youtube_is_facilitating_the_sexual_exploitation/)

Many have come to the conclusion that YouTube is both aware and unwilling to
do anything about it, because money. Fairly silly, but I do wonder how they've
let it get so out of control that someone could make a video like this. Seems
like allegations of child pornography should set off alarm bells at every
level of the company.

Edit: been going through some of the YouTube comments. Some are painfully
overt. Someone named "the predator" linking timestamps and saying "what are
all want." Gross.

~~~
tomcam
I have to admit I’ve been afraid to look at any of those links and end up on
some kind of list.

------
komali2
I feel like the video format is neutering the response on hn, which is a shame
because I'm very interested in what people here think. I'm cross-posting the
video creator's description below:

>Over the past 48 hours I have discovered a wormhole into a soft-core
pedophilia ring on Youtube. Youtube’s recommended algorithm is facilitating
pedophiles’ ability to connect with each other, trade contact info, and link
to actual child pornography in the comments. I can consistently get access to
it from vanilla, never-before-used Youtube accounts via innocuous videos in
less than ten minutes, in sometimes less than five clicks. I have made a
twenty-minute Youtube video showing the process, and there is video evidence
that these videos are being monetized by big brands like McDonald’s and
Disney.

>This is significant because Youtube’s recommendation system is the main
factor in determining what kind of content shows up in a user’s feed. There is
no direct information about how exactly the algorithm works, but in 2017
Youtube was caught up in a controversy known as “Elsagate,” after which they
committed to implementing algorithms and policies to help combat child
abuse on the platform. There was some awareness of these soft-core pedophile
rings as well at the time, with Youtubers making videos about the problem.

>I also have video evidence that some of the videos are being monetized. This
is significant because Youtube got into very deep water two years ago over
exploitative videos being monetized. This event was dubbed the “Ad-pocalypse.”
In my video I show several examples of adverts from big name brands like Lysol
and Glad being played before videos where people are time-stamping in the
comment section. I have the raw footage of these adverts being played on
inappropriate videos, as well as a separate evidence video I’m sending to news
outlets.

>It’s clear nothing has changed. If anything, it appears Youtube’s new
algorithm is working in the pedophiles’ favour. Once you enter into the
“wormhole,” the only content available in the recommended sidebar is more
soft-core, sexually implicit material. Again, this is all covered in my video.

>One of the consistent behaviours in the comments of these videos is people
time-stamping sections of the video when the kids are in compromising
positions. These comments are often the most upvoted posts on the video.
Knowing this, we can deduce that Youtube is aware these videos exist and that
pedophiles are watching them. I say this because one of their implemented
policies, as reported in a blog post in 2017 by Youtube’s vice president of
product management Johanna Wright, is that “comments of this nature are
abhorrent and we work ... to report illegal behaviour to law enforcement.
Starting this week we will begin taking an even more aggressive stance by
turning off all comments on videos of minors where we see these types of
comments.” However, in the wormhole I still see countless users time-stamping
and sharing social media info. A fair number of the videos in the wormhole
have their comments disabled, which means Youtube’s algorithm is detecting
unusual behaviour. But that raises the question of why Youtube, if it is
detecting exploitative behaviour on a particular video, isn’t having the video
manually reviewed by a human and deleting it outright. A significant number of
the girls in the videos are pre-pubescent, which is a clear violation of
Youtube’s minimum age policy of thirteen (and older in Europe and South
America). I found one example of a video with a prepubescent girl who ends up
topless midway through the video. The thumbnail is her without a shirt on.
This is a video on Youtube, not unlisted, openly available for anyone to see.
I won't provide screenshots or a link, because I don't want to be implicated
in some kind of wrongdoing.

>I want this issue to be brought to the surface. I want Youtube to be held
accountable for this. It makes me sick that this is happening, that Youtube
isn’t being proactive in dealing with reports (I reported a channel and a user
for child abuse; 60 hours later both are still online) or with this issue in
general. Youtube absolutely has the technology and the resources to
be doing something about this. Instead of wasting resources auto-flagging
videos where content creators "use inappropriate language" and cover
"controversial issues and sensitive events" they should be detecting
exploitative videos, deleting the content, and enforcing their established age
restrictions. The fact that Youtubers were aware this was happening two years
ago and it is still online leaves me speechless. I’m not interested in clout
or views here, I just want it to be reported.

