
YouTube bans comments on all videos of children - _wmd
https://www.bbc.co.uk/news/technology-47408969
======
jawns
Chris Ulmer, who runs the popular YouTube channel Special Books for Special
Kids, said that comments were removed from his channel despite no evidence of
unacceptable comments, and YouTube told him that if he turned comments back
on, he risked his channel being deleted. [EDIT: Turns out, this is not
actually the case. See child comments.]

[https://twitter.com/chrisulmer/status/1099366622329036801](https://twitter.com/chrisulmer/status/1099366622329036801)

"Last night I realized all of the comments on SBSK's YouTube channel were
disabled. I saw I could manually turn them back on so I did. Then I read a
post by YT saying that by turning comments on I risk our channel being
deleted. I love and respect YT but IDK what to do.

"The beauty of SBSK is the love and acceptance in the comment section. It
shows families and individuals across the world that their [sic] are people
who accept them. Many people I interview have been socially isolated. Comments
can change their self perception."

So it sounds as if YouTube content creators are now in the unenviable position
where they need to actively moderate the comments section for videos featuring
children, and if they don't do so to YouTube's satisfaction, they could have
their entire channel nuked. Even if you're pretty darn sure that your
commenters will behave themselves, that doesn't sound like a good deal.

Seems like YouTube will need to come up with some sort of "trusted subscriber"
designation, and allow content creators to permit comments only from those
subscribers, so that any random bad actor can't swoop in and destroy a
channel.

~~~
xfitm3
I might be in the minority here, but I enjoy the low-quality YouTube comments.
Most people dismiss them as cancer, but the reality is that a lot of people
think in patterns that drive these comments.

I would rather be in touch with and exposed to this rather than try to pretend
it doesn’t exist. It won’t ever go away, it will just be hidden.

~~~
disillusioned
I think that there's an element of validation and indoctrination that is a
serious concern here. In addition to an overall significant lowering of the
discourse around things by allowing garbage quality trolls and other horrible
comments, and in addition to bullying, aggressive behavior, stalking, and
other things that are broadly considered unpleasant, this kind of behavior
breeds more of this behavior.

Consider someone growing up on the internet. The more they're exposed to this
sort of content, the more it will be normalized in their mind. The more
they're exposed to this sort of content, the more likely susceptible people
are to be radicalized by it, and to grow into the same sort of troll. Hiding
or suppressing low-quality content creates a herd immunity effect. It prevents
a shift of the Overton Window to where "generally allowable discourse"
suddenly includes timestamping the most "salacious" parts of a child's video,
or telling people to actually kill themselves, or sharing their purely racist,
hate-filled viewpoints.

While you may rather be exposed to it because you have the strength and
capability to view it as a curiosity and a sociological study, the
impressionable among us maybe deserve to not be bombarded with garbage, and we
may have a moral duty to at least make some effort to minimize indoctrination
and radicalization on these platforms.

~~~
colechristensen
This is one of the reasons I think anonymity on the Internet is a bad thing.
Not that I am saying it should be banned, but it should not be the norm. Much
of the toxicity simply would not happen, could be prosecuted, or could be
filtered if a "real person" attribute was widely available.

~~~
joe_the_user
Toxic comments can be just as common in non-anonymous forums and venues.
Facebook produces a lot of toxicity, for example. Further, consider that the
Internet is much less anonymous than it was twenty years ago, and that hasn't
lowered the overall level of toxicity, imo. The type of person who
spontaneously makes toxic comments will still make them when forced to use
their real name (they'll just suffer more for it). The provocateur, who
calculatedly elicits toxicity, is always going to be here too - no matter how
many Russians Facebook filters out.

One factor is that once one person begins attacking another, both using real
names, both people have a hard time backing down, especially if they know each
other in real life or if they are semi-public figures. For a lot of people,
admitting that they are wrong is a huge hurdle - and these tend to be the
people who engage in toxicity in the first place.

~~~
colechristensen
...and people have been jerks since there were people, I am not implying that
people being unpleasant was created by anonymous commenting.

However there would be quite a bit of backpressure against certain kinds of
Internet toxicity if people were less anonymous.

------
zaroth
The fact that Google is ceding the fight against toxic comments on YouTube is
actually pretty shocking.

A company that knows more about you than you can possibly imagine, and more
about automated sentiment analysis than anyone in the world, couldn't
algorithmically determine who should be allowed to post on certain subsets of
videos, or devise a system it thought was worth deploying to ensure comments
meet a basic level of decency.

~~~
_wmd
The ramifications of this are profound, at least:

\- shake the money stick at them and they will dance (thanks, of all people,
Nestle!)

\- they've admitted a serious problem exists that they were unwilling to deal
with until external pressure forced them to (i.e. they can't be trusted to
self regulate)

\- they've all but admitted they can't fix this in reasonable time, if at all

This is the first time I can think of where there has been a seriously
material chink in Google's... cultural armour? Turns out the advertisers are
in control, and turns out Google doesn't have a technical cure-all. It'll be
interesting to see how they attempt to reintroduce comments in the long term -
no doubt more ML. Of course, this says nothing about a recommendation system
that continues to blindly cluster videos of lithe toddlers together. I wonder
if any advertisers are making a stink about that.

Favourite summary: kids are safer on YouTube today because of Disney and
Nestle, not because of Google. Let that sink in. The subtext here of course is
that Nestle and Disney are some of the most evil companies around, and yet
they're the ones that were forced to strong-arm Google. The irony of this
defies words, and the reality of the only mechanism at play here to protect
children is almost as disturbing - these companies don't "care about
children", they were only forced into action to maintain their reputation.

(gentle reminder: HN punishes highly commented 'controversial' stories. If you
care about this issue being more widely understood, try to limit your
commenting)

~~~
theNJR
I, for one, do not want Google deciding what is true.

~~~
brentonator
God forbid they put a link to Wikipedia under flat earth videos, huh?

~~~
mediocrejoker
You only think that's a good idea because you don't believe in flat earth.
They could equally put a link to flat earth sites under NASA/SpaceX videos. Do
you really want to decree that whatever they link is "true" just because you
agree with them in this particular instance?

~~~
ben_w
When I am objectively wrong, I want my mind to be changed. I expect this to be
the case for around 10% of my ‘knowledge’, even on topics I care to educate
myself about, and much worse on other issues.

Putting a link under the videos isn’t likely to achieve that, but that’s a
separate issue.

The only problem I have with Google doing this, is that I trust corporations
and government about the same — i.e. that both will lie and dissemble as much
as they are allowed to get away with for their own or their leader’s benefit,
without regard for my interests.

~~~
Mirioron
You're missing the point. Most things in life aren't settled as easily as the
flat Earth debate. Google could also do this on something far more
controversial (eg political) and justify it in the same manner. Imagine if
Google denied man-made climate change and every video about climate change got
a link to some website that says it isn't real. That wouldn't be acceptable,
would it? But being okay with Google doing this for flat-earth also makes it
more acceptable for Google to do it for climate change.

~~~
throwawaymath
_> But being okay with Google doing this for flat-earth also makes it more
acceptable for Google to do it for climate change._

Sounds great, I'm all for it.

Sure the truth can be complicated, but I fail to see how implementing software
that auto-links to a relevant article on Wikipedia causes Google to be the
arbiter of truth.

~~~
Mirioron
> _Sounds great, I'm all for it._

Even if Google thinks climate change isn't man-made and links to sources
backing that up?

~~~
mrep
Do you think YouTube should have censored videos about government surveillance
programs before the Snowden documents came out?

~~~
throwawaymath
That sounds like a leading question for rhetorical purposes - is this
something Google actually did, or are we speaking purely of hypotheticals
here?

Note that the solution I posed is something Google _already does_ on its
search engine without calamity, so I don't see a reason why it would fail for
YouTube. In contrast, the example you're giving seems hard to address by just
linking to an authoritative source.

Put another way, I'm not advocating for Google to arbitrate the truth on a
case by case basis. I'm advocating for Google to identify _ahead of time_
which sources are well-researched and trustworthy, then _outsource_ its fact-
linking system to those sources.

If Google were to supply facts on a case by case basis that would be suspect.
But that's not how the company operates, so I'm deeply skeptical they would
become some kind of arbiter of truth.

------
minimaxir
Official statement: [https://youtube-creators.googleblog.com/2019/02/more-updates-on-our-actions-related-to.html](https://youtube-creators.googleblog.com/2019/02/more-updates-on-our-actions-related-to.html)

It's not "all".

> _A small number of creators will be able to keep comments enabled on these
> types of videos._ These channels will be required to actively moderate their
> comments, beyond just using our moderation tools, and demonstrate a low risk
> of predatory behavior. We will work with them directly and our goal is to
> grow this number over time as our ability to catch violative comments
> continues to improve.

~~~
golemotron
YouTube steadily moves toward the TV station model.

~~~
MBCook
Honestly, is it so bad if most videos end up with comments turned off in the
future?

I use a blocker to hide comments and don’t seem to miss much. Let people
discuss videos on other sites like twitter or reddit.

~~~
derefr
> Let people discuss videos on other sites like twitter or reddit.

Yes, exactly. I've never understood why content-host sites seem to think
attaching first-party comment hosting directly to hosted content is a good
idea.

Why not just host what you host, and let aggregator/discussion sites (various
subreddits, various private forums, Usenet, Slack/Discord groups, etc.†) link
to the thing-that-you-host and carry the comments? You can offer RSS feeds to
ensure those aggregator sites can—if they want—automatically generate their
own link-posts when your site has a new post.

And, best of all (from my perspective, at least), it doesn't force people who
are consuming your content for different _reasons_ into the same room, where
they will inevitably shout over one another due to competing access needs.
Each community can have their own conversation about your content, and can
feel morally justified in kicking out uncivil people, since they can just go
find a _different_ conversation about the same thing in a _different_
community. Rather than there being one "canonical" conversation that they feel
shut out of.

† I exclude Twitter from this list because Twitter has no concept of
partitioned subcommunities, and so inevitably, if there's a conversation about
something on Twitter, it becomes the "canonical" conversation. That's kind of
the point of Twitter, for some use-cases (e.g. celebrities arguing with other
celebrities where other celebrities can watch and get pulled in), but it's bad
for the use-case of regular people trying to have a regular discussion.

~~~
est31
There's a video on YouTube of a real airline pilot reviewing the Sully movie
[0]. The comment section is filled with people from the field: pilots, turbine
technicians, etc. Even Sully's co-pilot Jeff Skiles HIMSELF commented on the
video. While there are usually tons of low-quality "upvote this comment if
watching in 2019" things, this thread was wonderful. And if the comments were
hosted on a different site, e.g. some pilot-specific website (there probably
are such sites), or even Twitter, I think I wouldn't have found them.

[0]:
[https://www.youtube.com/watch?v=hXfWU_ER3Mg](https://www.youtube.com/watch?v=hXfWU_ER3Mg)

~~~
derefr
Were there any valuable comments that weren't root-level (or that wouldn't
have been made as root-level comments if it were impossible to reply to
comments)?

I ask because I feel like you're describing the same sort of system I am in my
sibling reply below:

> But really, if content creators want to get "feedback", they just need a
> thing that's essentially like email—a separate private channel between them
> and each audience-member—but where you can easily forward a whole email
> thread, after the fact, to the content host, and it'll appear below the
> content in an FAQs section. Sort of like how reviews work in some digital
> storefronts.

If all such comments were submitted knowing they'd just be treated as "private
feedback" by default, with the moderator having the _option_ to promote such
feedback to appear as a public "comment" [which, yes, isn't that different
from "comments with pre-moderation", except for different _expectations_ on
commenters' parts], do you think that (valuable!) thread still could have
happened?

~~~
est31
Yeah, Jeff Skiles's comment got a lot of replies and he replied to some of
them. This spontaneous AMA thing was definitely valuable, made the guy even
more approachable!

Cramming everything into one flat thread has already been done in phpBB; it
works for small threads, but big threads run into issues when you are
interested in only some parts of the discussion.

~~~
derefr
How about a modification of my proposal:

1\. the creator has their own private forum (e.g. a Discord group);

2\. you, as a potential commenter, can easily join said group as a guest, with
your YouTube credentials used to identify you in the group;

3\. _or_ you can also just post "into" the group _without_ joining it, using
the form below the video;

4\. the community itself can be deputized to moderate such comments;

5\. the community can have all sorts of thought-provoking discussions in
response to such comments, visible to one-another in the (private) group;

6\. the content-creator would have an easy interface to "highlight" any given
thread/subtree of conversation from the group, making it into a visible
comment thread appearing below the video.

Essentially, this proposal would just take each YouTube channel, and make its
comments section into a private forum under the content-creator's control; and
then, separately, make the "comments section" below the video into a sort of
moderated "best of" version of that forum's discussion of the given post.

~~~
kkarakk
This just promotes a "go to reddit/discord" mentality, which hurts discussion
once reddit/discord cracks down on your interests.

------
Sargos
What a ham-fisted reaction to the controversy. Now all content creators have
to ensure that no kids ever appear in their videos, including in background
footage and movie clips.

I think this is a major problem with our current centralized model of the
internet where giant corporations make all the rules. Large corporations have
no ability to rationally judge situations or inject nuance into discussions.
They simply take the easiest and most expedient pathway to make the problem go
away regardless of the long term consequences. This is not behavior we want
from the few major building blocks of our modern life.

~~~
Mirioron
I disagree that this is a problem with giant corporations. I think the problem
lies in a public that demands blood over everything. There is no solution to
the problem YouTube is being asked to solve other than simply banning all
videos with kids in them.

~~~
pcurve
"public that demands blood on everything"

public = Small, vocal minority.

~~~
judge2020
Not even the public, it's the small group of news outlets which are trying to
discredit youtube as a source of news and entertainment. Youtube is a direct
competitor to cable news and the networks that host them, so being able to
say, "Look! There's bad stuff that they didn't know about!" is a great way to
push that narrative and reclaim some advertising dollars.

This isn't even in retaliation to the Verge and the other news articles that
originally covered the story; it's due to the advertisers that pulled out of
YouTube. There's very little chance the advertisers pulled their ads because
they hold the 'moral high ground' - big corps like AT&T and Disney are simply
doing this so that, after the issue has passed, they can negotiate lower ad
rates with YouTube. If there were no ads whatsoever for "Avengers: End Game"
on YouTube, the movie would still be popular, but it wouldn't generate nearly
the same amount of buzz or 'hype' as the first one did.

~~~
CM30
This sort of stuff and the fake news hysteria in general seem like an attempt
to discredit all alternative media sources, and the internet/blogging/free
speech in general. For them, competing with thousands of people writing/making
videos for free means their business model doesn't work any more.

Fact is, they won't stop with these articles and attack pieces and hysteria
till the internet is basically like cable TV.

------
m23khan
As a father of very young kids, I think this is nothing short of an emergency.
Not only should commenting on kids' videos be banned; YouTube should also
heavily vet and tag all their kids-related videos. These days kids are hooked
on watching cartoons and poems on YouTube, and a lot of the time there are
really garbage videos targeting kids, just showing little girls buying and
playing with makeup, cleaning houses, and learning about shopping.

~~~
drak0n1c
People said the same about the chat rooms we used when we were kids. You could
argue that those had much more potential than Youtube comments to lead to
actual physical harm. In some cases, bad things really happened. Chris Hansen
made a show about it, and police used chat rooms to find pedophiles. But were
internet chat rooms ever an "emergency"? Where do we draw the line with bans?

There are family-friendly content creators who diligently police their
comments sections who now have their livelihoods severely impacted.

~~~
Wowfunhappy
I'm mostly playing Devil's Advocate here, but consider that Youtube is _much_
more widely used than IRC chatrooms ever were.

~~~
hbosch
In addition, Coca Cola never ran ads on IRC spliced into channels (as far as I
know ;)).

------
Posibyte
This seems like a ham-fisted solution, does it not? Channels like ChadTronic
play on the nostalgia of being a kid back in the '80s and '90s, yet AFAIK he
has been unable to enable comments on any of his videos. The weird, funny
comments are a big part of the experience in those kinds of videos. Some
tropes of the show itself have been built around the comment section.

Would it make more sense to instead target the patterns used by predatory
commenters rather than shutting down the system completely? This is Google,
the company that brings meaning out of arbitrary data - is it not possible to
build social graphs of what people like, scour the comments, and act on these
time codes? Couldn't you restrict the features that enable those patterns?

Lastly, does forcing these comments off negatively impact the rankings of
these creators? Comments have traditionally played into the engagement of any
said video and had an (understood) impact on how a video ranks on release. Are
these channels now just permanently stunted in their future growth?

~~~
pgrote
>Lastly, does forcing these comments off negatively impact the rankings of
these creators? Comments have traditionally played into the engagement of any
said video and had an (understood) impact on how a video ranks on release. Are
these channels now just permanently stunted in their future growth?

1) Yes. 2) Yes.

Youtube has taken this action without an overall plan. One of the Youtube
creators was told as much in a chat with youtube support.
([https://youtu.be/oeI0-ijIotk?t=504](https://youtu.be/oeI0-ijIotk?t=504))
They indicated to him that some creators will be negatively impacted until
everything is worked out. The program is working just like they planned. lol

~~~
badfrog
> The program is working just like they planned.

Or like they didn't plan

~~~
asdfasgasdgasdg
Or ya know they thought the situation was sufficiently urgent that they needed
to do something immediately without thinking through every possible
eventuality. Have you never dealt with a PR emergency before? Often, the
"let's take two months to game this out" approach is not the most effective
way to address the problem.

------
michaelmrose
Man, people are depressing. I'm guessing the scumbags will have to post scummy
things on other forums and link to videos. At least the consumers on YouTube
won't be subjected to such.

Incoming new challenges for reddit. Maybe it's an opportunity to get rid of
some more scummy communities.

~~~
kabacha
You are depressing. You are validating blanket "solutions" that harm creative
content - and at what cost?

This obsession with regulating and controlling everything that happens on the
internet is just hysterical. Also, "think of the kids" is the most common of
the "appeal to feelings" fallacies; it keeps getting hammered and hammered,
and it's _depressing_ to see people fall for it.

~~~
michaelmrose
It would actually be better if they simply removed YouTube's comments feature.
Have you ever tried to interact with YouTube comments?

------
jumpman500
I think the time is ripe for a YouTube competitor to take a chunk of its
market share. YouTube has the impossible task of making a video hosting
platform for everybody. Kids definitely need a particularly safe space to
upload and to have supervision of some sort. Even adults don't necessarily
want to be subject to the weirdness of the internet. Others really want a
completely free, wild, and weird platform. It doesn't make sense for one
platform to try to cater to all these groups.

~~~
theNJR
My conclusion, after writing and thinking about this a lot lately, is that the
three mega platforms are unwise, but unavoidable, due to network effects.

We are going to have to make a concerted effort to somehow fix this. Hyper-
niched communities, like Hacker News, work so well because we are united
around shared passions and goals.

Mega platforms contain many ideologies which are at odds with each other,
constantly battling for their truth.

Never before in history have competing tribes had to share the same space like
this. It doesn't work.

~~~
zanny
Facebook et al go out of their way to silo their users into small cliques of
influence. On YouTube, if you watch gaming videos you get ads for gaming and
get recommended gaming, and you aren't going to see ads for makeup or power
tools.

Any open-access community will attract hostile actors seeking to usurp it. The
problem with Facebook / YouTube / et al is that these corporations are more
interested in trying to find a technical way to automate moderation than in
actually moderating and policing their platforms to weed out the bad actors.

It isn't something that is inevitable or even made worse by scale; it's just
that these companies are operating to maximize profit, and needing human faces
at screens to moderate public comments is a substantial expense.

~~~
nilskidoo
I bet it's more psychologically-based than that. From collegiate safe spaces
to news media bubbles, to the social networking accounts themselves where
nobody honestly follows people or ideas they themselves object to, everybody
wants their own gated community/echo chamber.

Personally, as big as I am on free speech and anti-censorship, I don't believe
that anybody should be allowed to post pix of minors online under any
circumstances - even parents posting harmlessly to their personal FB profiles
- UNLESS it's posting old pix of one's self. Otherwise I feel it betrays the
privacy rights of those minors. They may well feel differently when they come
of legal age, and by all means they can post whatever they like of themselves
after the fact, but in the meantime privacy should be the default.

------
40acres
Does YouTube even need comments? I can't remember any meaningful discussion
happening in the comments, it's mostly meme type stuff. Long term I can see
this being the final resolution for toxic comments.

~~~
pgrote
The most useful comments I've found on youtube are for home improvement or
repair. People chime in with other ideas or better ways of doing things.

~~~
pwython
Exactly, on a lot of technical/tutorial videos you'll see helpful discussion,
like people asking questions that weren't covered, pointing out flaws, etc.

------
Wowfunhappy
Legitimate question, does this apply to any video in which a minor appears for
even a very brief period? If you're recording an hour-long video podcast, and
a child of one of the hosts runs through the room in the background for one
second, does that entire video get comments removed?

~~~
kabacha
It doesn't matter.

Youtube videos are a business to some people so anything that has _potential_
to harm revenue will go straight out of the window.

"Is it worth to include this kid in my skit if it will reduce the revenue by
10%?"

Combine this with 10s of other already existing hypotheticals and you have
yourself a neutered vlog machine that is today's youtube.

~~~
Wowfunhappy
> "Is it worth to include this kid in my skit if it will reduce the revenue by
> 10%?"

Yeah, that answer is easy—it's not worth it. I'm more interested in what
happens if it's accidental.

You mention vlogs. If a "vlogger" is recording themselves walking down the
street and a kid passes into the frame, what do they do? Do they need to scrap
the clip and start over? Do they attempt to edit out the kid? Do they just
never record in public anymore? Actually, that last option seems the most
practical, but it robs the whole piece of its intimacy.

If Youtube elects to take the stricter approach to this, it's going to have
unintended consequences that I'm not seeing anyone else talk about right now.
I almost wonder if it would be better to remove public commenting entirely.

~~~
kabacha
My vlogger example was rather lazy. I'm not really sure how to define the
current meta of YouTube content, but "creative" is definitely not a word I
would use for the majority of it.

I mean, every YouTuber thinks about how to avoid being demonetized, or just
reduced profits in general.

I think your point about public filming is very valid as well. Even if the
risk that your video will get caught in the filter is <5%, it's still not a
risk worth taking.

I watch a lot of skating videos, and unsurprisingly they feature kids,
sometimes as main subjects but often in the background. Filming at a skatepark
is suddenly a considerable risk because a falling kid might evoke some sexual
connotation in an extreme minority.

This is a lazy solution that will unfortunately just be added to the colossal
folder of other lazy YouTube solutions, until this medium loses the last bits
of creativity it had.

------
notTyler
1\. YT has problem that generates a lot of bad press and loses money/face.

2\. YT develops either easily-gameable or far-too-heavy-handed automated
solution to said problem, because they can't afford to actually police their
content.

3\. Passionate creators (that don't get struck down mistakenly) have to jump
through ever more hoops, while ad $ flocks back to YT.

4\. Repeat steps 1-3.

Granted, I don't really see a solution, as there's just straight up too much
content, and it's ever growing since the barrier to entry is just an internet
connection.

------
scarejunba
When we were young, we had the Internet and we'd share source code and run our
tiny BBSes and then our geocities pages and enjoy sharing. We told each other
how when we were older and in charge we'd have such free expression. As 11
year olds we discovered goatse and survived bypassing the 18+ porn warning.
We're adults now and we go on all right but we made one big mistake.

We let you censorbots take over. Good job. You took this beautiful thing we
had and you ruined it.

And now you're coming for everything under the guise of taking care of the
children. It took so much work to get you guys to leave video games alone and
now you're here for everything else.

How did we let you do this?

------
clusmore
I would love to see a solution for this in the browser. If there was a tag, or
some metadata attribute that indicated where each part of the document
originally came from, you could just configure your browser to not render
parts you don't trust. So in this case, YouTube could just flag the comment
section as being provided by other users, and you can choose to block it if
you don't trust other users to provide appropriate content. I would love to
say that websites would flag ads as coming from a 3rd party ad company, and we
could choose to block those too, but of course they wouldn't.
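As a sketch, the idea could look something like this hypothetical markup (the `provenance` attribute is invented here for illustration; no such standard exists):

```html
<!-- Hypothetical: "provenance" is an invented attribute, not part of any standard. -->
<video src="..."></video>

<!-- The host flags the comment section as user-supplied... -->
<section id="comments" provenance="user-generated">...</section>

<!-- ...and ads as third-party, so a browser preference such as
     "don't render provenance=user-generated or provenance=third-party"
     could drop those subtrees before rendering. -->
<aside id="ads" provenance="third-party">...</aside>
```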

~~~
lucb1e
Other than the logo on top, the menu, the footer, and your own text in the
search bar, what content on YouTube would _not_ be covered by that tag?

------
zapdrive
I wonder how long before they totally disable all comments. I mostly watch
YouTube on my smart TV, and as far as I know, I can't read any comments on the
TV. I can't even read the description of the video, or the tags, or even the
number of likes/dislikes. So I guess that's the direction YouTube is headed.

~~~
MBCook
Same thing if you’re watching on a phone, at least outside the app. I can
either watch the video OR read the comments.

Guess which one interests me more.

Between that and watching on my TV comments barely exist for me anyway.

------
cmroanirgo
It seems to me that YT could've implemented the perfect honey trap for pedos.

Make an inappropriate remark about children and (this being Google) have them
track you down using a variety of fingerprinting techniques and report you
directly to the appropriate authorities in your country. The latter part could
probably even be automated.

And yet, they've taken this stance: one that puts the onus on the content
creators themselves to moderate the comments. Unfortunately, the pedos are
still out there, being predators.

~~~
djsumdog
They already do this, and so does Microsoft. Many companies have shared image
fingerprints (hashes of images that are robust to resizing), along with a
small set of people granted permission by the Dept of Justice to verify and
report such images. It's a small number of people - which led to at least one
Microsoft employee filing a lawsuit over the PTSD he developed from having to
verify and report illegal images.
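The resize-tolerant fingerprints mentioned above can be illustrated with an "average hash", a minimal sketch of the general idea only - the actual algorithms these companies use (e.g. Microsoft's PhotoDNA) are far more sophisticated and not public:

```python
# Sketch of a perceptual "average hash": a fingerprint that survives resizing,
# unlike a cryptographic hash of the file bytes. Pure stdlib, illustrative only.

def average_hash(pixels, size=8):
    """Fingerprint a grayscale image given as a list of rows of 0-255 ints.

    1. Downscale to size x size by block-averaging, which makes the hash
       insensitive to the original resolution.
    2. Emit one bit per cell: 1 if the cell is brighter than the overall mean.
    """
    h, w = len(pixels), len(pixels[0])
    cells = []
    for by in range(size):
        for bx in range(size):
            # Average the block of source pixels that maps onto this cell.
            ys = range(by * h // size, max((by + 1) * h // size, by * h // size + 1))
            xs = range(bx * w // size, max((bx + 1) * w // size, bx * w // size + 1))
            block = [pixels[y][x] for y in ys for x in xs]
            cells.append(sum(block) / len(block))
    mean = sum(cells) / len(cells)
    return [1 if c > mean else 0 for c in cells]

def hamming(a, b):
    """Number of differing bits; a small distance means "probably the same image"."""
    return sum(x != y for x, y in zip(a, b))

# A 16x16 image with a bright top half, and the same image "resized" to 32x32:
img = [[200 if y < 8 else 20 for x in range(16)] for y in range(16)]
img_resized = [[200 if y < 16 else 20 for x in range(32)] for y in range(32)]

print(hamming(average_hash(img), average_hash(img_resized)))  # -> 0
```

Here the resized copy produces an identical 64-bit hash, which is the property that lets shared fingerprint databases match re-encoded or rescaled copies of known images.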

As far as these YouTube videos, Google does get rid of all content with clear
abuse and reports it to the authorities in the US and possibly the origin
country. For this particular case, none of these videos have illegal content.
Most of them are videos of teens filming themselves. It's others who go
through and place them into a creepy context by grouping all of them together
in playlists and via comment-chain links.

Creating any type of honeypot is most likely unethical and illegal. Just look
at what happened to the FBI in the Playpen case.

~~~
Wowfunhappy
> It's others who go through and place them into a creepy context by grouping
> all of them together in playlists and via comment-chain links.

Wanted to pull this out as (IMO) the most important part of your comment.
There's nothing for the police to go after, because nothing illegal was
happening here. The videos themselves were 100% benign, and while the comments
were creepy and awful, they were also free speech.

~~~
lucb1e
So they cannot be censored, sure. But can they not be probable cause for a
temporary tap on their internet connection? And if that reveals stuff like
https connections to unpopular sites whose content can't be viewed publicly,
Tor traffic, maybe torrent traffic, then that might be suspicious enough to
bust the door while the connections are going on.

I'm against dragnet surveillance and all, but comments that are clearly
predatory are a red enough flag that even I start to think it's warranted to
escalate step by step, and continue depending on what is found at each step.

Of course, this entirely depends on whether the comments are "oh look at that
sweetie, probably has a nice puss" or just "oh look at that sweetie"
interpreted the wrong way. I'm on the second page of HN comments and read
three linked articles about it, and nobody mentioned what it's all about other
than "predatory comments on videos involving minors".

Edit: As expected, Dutch news coverage is more explicit than the prudish
American coverage.
Example comment is: "$time $camel_emoji toe... Then again at $time". Another
example: "Hi honey baby where r u from??". Last example is just "Love you"
with a bunch of emojis like hearts and presents. Another website mentions the
kids were called "godess" or "barbie". This sounds like it would be
extraordinarily easy to just build a blacklist of words that trigger new
comments to land in a moderation queue (processed either by youtube itself or
by the video owner)... not sure what the difficulty is exactly. It seems
harder to detect kids in videos than to build a blacklist of these comments,
except that the former can be automated and the latter is ongoing moderation
(probably too simple for Google: "if you can't automate it, it can never
scale, so it can never be a solution").
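(A minimal sketch of that blacklist-plus-moderation-queue idea in Python; the phrases and patterns below are placeholders drawn from the examples above, not a real blacklist:)

```python
import re

# Placeholder blacklist based on the example comments above; a real
# one would be much larger, multilingual, and actively maintained.
BLACKLIST = {"godess", "goddess", "barbie", "where r u from"}

TIMESTAMP = re.compile(r"\b\d{1,2}:\d{2}\b")          # e.g. "1:23"
HEARTS = re.compile("[\u2764\U0001F493-\U0001F49F]")  # heart emojis

def needs_review(comment):
    """Return True if a new comment should land in a moderation
    queue instead of being published immediately."""
    text = comment.lower()
    if any(phrase in text for phrase in BLACKLIST):
        return True
    # A bare timestamp plus affectionate emojis was the pattern
    # described in the reported comments.
    if TIMESTAMP.search(text) and HEARTS.search(comment):
        return True
    return False
```

The queue itself could be processed by YouTube or by the video owner; the hard part is curating the list, not the matching.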

------
post_break
I'd like them to take a look at predatory selling of merchandise to children
next.

~~~
pbhjpbhj
You know they're an advertising company? Unless you're offering ways to
improve their predation on consumers I can't imagine they'd care to hear what
you have to say.

~~~
post_break
No I'm talking about Jake Paul and the likes telling kids they literally won't
be cool unless they buy his merch or attend his "events".

~~~
pkaye
That is where parents should do their job.

------
rvanmil
> "better protect children and families"

They're not protecting anyone, they're just sweeping the problem under the
carpet. The real problem is naive/ignorant/reckless parents allowing kids to
upload their stuff for the world to watch.

~~~
icebraining
I'm pretty sure many don't even know; after a certain age, kids can be pretty
sneaky if they want. We only had a single home computer, in the living room,
yet I still accessed plenty of shit my parents had no idea about. And
nowadays, there are always those school friends with smartphones...

~~~
Mirioron
And do you consider that you turned out in some particularly bad way?

~~~
icebraining
Well, certainly not due to anything I might have seen online.

------
circa
Just ban all the comments. The world would be a better place.

~~~
anonytrary
That's the easy solution, but the easy solution is hardly ever the right
one. There are plenty of reasons why commenting on videos can be useful.
On lectures and other educational content, the top comments typically point
out errors and add useful information.

------
saagarjha
> YouTube told the BBC it would use algorithms to detect which videos
> contained children.

I guess we will have to see how well this works out. I can foresee someone's
legitimate video having its comments suddenly removed causing uproar…

~~~
tengbretson
Man this is going to kill Michael Reeves' channel.

~~~
thanatos_dem
That child does things that are far too dangerous to be doing without
parental supervision.

~~~
zanny
That "child" is 21 this year.

------
Wowfunhappy
I think this should be the original source, instead of the BBC?
[https://youtube-creators.googleblog.com/2019/02/more-
updates...](https://youtube-creators.googleblog.com/2019/02/more-updates-on-
our-actions-related-to.html)

------
lbacaj
I’m going to say something that will likely get me downvoted like crazy here
on HN, to salvage some karma let me first say that I love the first amendment
and free speech.

Now, as a parent I am absolutely disgusted that there was recently discovered
a YouTube “gang” pushing comments that were promoting pedophilia, in plain
sight, tagging comments so algorithms could pick them up and other folks could
easily find them.

Secondly I am disgusted by the fact that there are videos embedded within
Peppa pig videos on YouTube, in the middle, pushing kids to harm themselves
and it is impossible to detect them. Google these things they are all real.

Look I get it, I’m a parent, I’m responsible for my child so the YouTube app
is gone, she will not get access to it until she’s old enough. That still does
not make this stuff ok, these are terrible human beings creating this stuff. I
applaud YouTube for stepping in but I still won’t install their app anymore,
this does not mean it is ok for their platform to become a great place for
that sort of stuff no platform should ever serve that.

When folks on here comment and say one parents voice outnumbers 100 others,
why the hell not if this content is targeted at our kids?

Edit:

Exhibit A:

(This one is apparently fake news, sorry. I was fooled myself a few days ago.)

Exhibit B: [https://medium.com/@jamesbridle/something-is-wrong-on-the-
in...](https://medium.com/@jamesbridle/something-is-wrong-on-the-
internet-c39c471271d2)

~~~
Dayshine
Why is Exhibit A a hoax that never harmed any children?

[https://www.bbc.co.uk/news/technology-47393510](https://www.bbc.co.uk/news/technology-47393510)

I'm not sure what point you're trying to make here.

~~~
lbacaj
While “momo” may be a hoax, there are videos with adults in them embedded
within these cartoon videos. That was just the first result I got when I
googled. While there may be a ton of fake news these days, I have seen these
mashups myself. Sorry if I hurt my own point with that first link, but the
point is very valid.

[https://medium.com/@jamesbridle/something-is-wrong-on-the-
in...](https://medium.com/@jamesbridle/something-is-wrong-on-the-
internet-c39c471271d2)

------
danols
Unsurprising, lazy & cowardly reaction from Google.

That said, not that comments will be missed. Been using the extension
"Distraction Free for YouTube" for the last couple of weeks which includes an
option to remove all comments on all videos. Plus every other distraction.
Been enjoying my YouTube time a lot more since I started using it. Amazingly
clean & refreshing experience. Now I'm on the hunt for a similar extension for
other parts of the web.

------
bilbo0s
Is this something that can be put on the shoulders of the video uploader? Make
it incumbent on the uploader to moderate their comment sections instead of
just yanking the comment sections altogether?

I understand YT's concern. The Feds find child porn discussions on your
website, they're not gonna simply ignore it, any more than they ignored
"escorts" on craigslist and backpage. I get that part. But instead of going
down this road where you just delete it for everyone, tell people that they
are now responsible for the content on their videos. We find these child porn
talks on your videos we yank your channel, and report it to the Feds. Simple
as that.

I don't know; that probably has some drawbacks too. I'm just
trying to find _anything_ that can be done to protect us, not necessarily from
the douchebags, but from the worsening consequences that the douchebags seem
to have on everyone else in the community.

------
raldi
Here's another signal their abuse team could be looking into: On videos where
a lot of people are following links to a specific timestamp, grab a couple
frames starting at that point and run them through the AI to see if they
contain children, and if so, put them on a review queue that's sorted by
"popularity".
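(A sketch of that signal in Python; `classify_frame` stands in for the child-detection model, and the event format is my own assumption:)

```python
from collections import Counter
import heapq

def review_queue(link_events, classify_frame, top_n=100):
    """Build a moderation queue from deep-link clicks.

    link_events: iterable of (video_id, timestamp_seconds) pairs, one
    per follow of a ?t= link. classify_frame(video_id, ts) -> bool is
    a stand-in for running frames at that point through the model.
    Returns the most-clicked flagged (video_id, ts) pairs, sorted by
    popularity, for human review.
    """
    popularity = Counter(link_events)
    flagged = [
        (count, key) for key, count in popularity.items()
        if classify_frame(*key)
    ]
    return [key for count, key in heapq.nlargest(top_n, flagged)]
```

Sorting by click count keeps the human reviewers focused on the timestamps attracting the most attention.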

------
xenadu02
Google manages to impose the Google Death Penalty on people for supposedly
violating AdSense terms or having an unpaid Google Cloud account or any number
of other reasons.

Is there some reason they can't require a verified account to comment and nuke
any account found to be making inappropriate comments? That sure would put a
stop to it.

------
lifeisstillgood
I am reminded that we are still doing a distributed web all wrong. We don't
need comments sections

Bear with me for a moment - but PageRank is the start of the problem. It's
basically measuring link distance from Stanford University web site - the
closer you are the more juice you get. But it assumes that a single domain is
a single publisher (and so all articles on a given site should get the same
google juice - the same rising tide)

But facebook (and many many other sites) _lend_ their domains google juice to
all comers - so that if you rationally want greatest reach (ie highest rank in
search engine) you publish not on your domain but on facebooks (or medium
etc).

But this leads to many unintended consequences- the facebook moderation
problem being one.

But if we see facebook not as a monolith but firstly as a web space provider,
then an indexer over that webspace, and then an algorithmic feed provider, we
see dramatically different ways of dealing with the problem. If we should give
people webspace to put their kids' pictures and share them (facebook's main
use case, arguably), then "normally" that space would be under the user's own
domain.

So why not have a more granular PageRank? Why not have google assign juice to
facebook.com/paulbrian separately from the juice assigned to
facebook.com/paulgraham?

Then we see no reason to host the webpages under facebook.com - if I need to
"earn" my google juice just as I would on a normal domain, there's no benefit.

I see comments sections in a similar vein - having my comments stored on
youtube's pages / domain is just not how it should be - they are my comments,
so they go on my server / web space and link up to the video.

That way google juice tells us which people are worth reading and which are
hate filled garbage - based on their reputation.

I need to flesh this out some more, but we do seem to have gone wrong.
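(The "more granular PageRank" above is mostly a question of what you treat as a node. A toy iterative PageRank keyed on domain/path rather than bare domain; the graph and names are made up for illustration:)

```python
def pagerank(links, damping=0.85, iters=50):
    """Plain iterative PageRank over a link graph.

    links maps a node to the list of nodes it links to. Keying nodes
    by full 'domain/user-path' (e.g. 'facebook.com/paulbrian')
    instead of bare domain is all it takes to make rank per-author,
    as the parent comment suggests.
    """
    nodes = set(links) | {t for ts in links.values() for t in ts}
    rank = {n: 1 / len(nodes) for n in nodes}
    for _ in range(iters):
        new = {n: (1 - damping) / len(nodes) for n in nodes}
        for src, targets in links.items():
            if targets:
                share = damping * rank[src] / len(targets)
                for t in targets:
                    new[t] += share
        # Dangling nodes (no outlinks) redistribute rank uniformly.
        dangling = sum(rank[n] for n in nodes if not links.get(n))
        for n in nodes:
            new[n] += damping * dangling / len(nodes)
        rank = new
    return rank
```

With per-path nodes, an author who attracts more inbound links simply earns more rank, regardless of which domain hosts the page.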

------
jennsang
This is true, as it doesn't need to be a matter regarding only the creeps. I
was thinking sociopolitically, where adults related or not make the decision
to use minors as props, whether for a thing as big as ad campaigns or as small
as for social brownie points. As a grownup, if someone were to post my image
without my express permission I could pull some legality in my favor. Children
generally don't have the same resources, lacking their own guardianship. I'm
just saying that until they can legally make such decisions for themselves,
privacy should be the standard, to save them from embarrassment or abuse. We
must all decide for ourselves what to share of ourselves.

------
hilbert42
If not a good idea, then it seems to me to be a very practical one (despite
it being censorship).

Nowadays, no matter how carefully one chooses one's words or how innocent
one's thoughts are about such matters, someone somewhere will misconstrue
them and take offense.

Shame, but that's the way of the world these days.

------
miguelmota
Another example of a big corporation trying to be the world's parent.
Censoring does more harm than good.

------
mhuffman
... yet they are still perfectly willing to allow content from children to
generate revenue off of them!

------
ltbarcly3
This is just idiotic. Rather than removing videos that creeps are into, or in
any way addressing the situation where their platform is being used by creeps,
they are just removing the evidence that creeps are there so we won't be able
to embarrass them by pointing out the creeps using their platform to be
perverted creeps.

So the creeps will keep creeping on kids, and nobody will know except Youtube,
since they have all the logs. Nobody will be able to screencap an
advertisement on a video with some pedophile's comments anymore, so the
problem is solved as far as Youtube is concerned.

~~~
jake_the_third
Just how would you solve this if you were in their position?

~~~
novia
Why not just prevent the use of timestamps on videos that contain children?

Why not say "you cannot view this video if you are over 18" unless the watcher
is specifically added to a list of allowed viewers?

Why not require the watcher to enable a webcam & mic while watching videos
containing children and create an algorithm to detect wanking?

There are lots of solutions.

relevant xkcd: [https://xkcd.com/2106/](https://xkcd.com/2106/)

~~~
jake_the_third
> Why not require the watcher to enable a webcam & mic while watching videos
> containing children and create an algorithm to detect wanking?

Just how far are we willing to go to make ourselves feel safer?! If you think
that recording viewers is an acceptable and proportional solution to this
issue, then you're sicker than the creeps who get off on videos of children.

I sincerely hope you're joking.

------
lovelearning
I feel this is unfair to content creators and to the subset of viewers who
comment decently (which I think is actually the majority of viewers).

I know of innocent content creators who have faced this, although their
content was not child-oriented and showed their own children for barely 10-15
minutes across their total content time of 500+ videos. Banning comments has
consequences such as viewers not being able to express sympathy or
appreciation when it might help. As a viewer, comments introduced me to many
new channels and concepts and cultures - all that is lost now.

This is not the right way IMO.

------
megous
Now no one will be able to verify for themselves the scale/severity of a
comment problem anymore, or learn from it. The evidence is gone.

Might have been an interesting thing to dive into for a while, look
dispassionately at, think about, and to learn something. But no.

All you can get now is some dramatized version of reality from third parties,
who can barely distinguish between crime, pedophilia, child abuse, and
inappropriate comments online, or be bothered to investigate the problem in any
depth. Perhaps it's too soon for that.

Meanwhile, outrage won again.

------
pysxul
This is leading to some crazy situations. I'm following this guy who lives in
China and makes videos about culture shock:

[https://www.youtube.com/watch?v=oeI0-ijIotk&ab_channel=laowh...](https://www.youtube.com/watch?v=oeI0-ijIotk&ab_channel=laowhy86)

He happens to be married and have children. Youtube disabled all comments on
all of his videos because one or two of them feature his kids... but 99% of
his content doesn't feature them.

------
sgt
It seems YouTube is doing a lot these days to the detriment of creators and
vloggers. YouTube is by far the best platform out there for posting videos,
but are there any viable alternatives? I think we need another YouTube. Vimeo
comes to mind but it's a bit fringe (mostly weird videos), and I'm
specifically looking for creator videos. Maybe this is a startup opportunity
for an eager HN-er, although I suspect running a video site is insanely
expensive.

------
GreenToad5
Are they going to require an ID system so the age of everyone in the entire
video can be verified? I guess 2/3 of YT vids will have comments disabled?
What about all those super popular channels featuring family vacations? Not
even documentaries can have comments?

I honestly don't believe that this headline is true, but if it is... Good
opportunity for a YT competitor to spring up.

------
Zelmor
Just remove children altogether from Youtube and be done with it. Fallout 1
was better without kids (stealing from you in Arroyo), and Youtube will be
too.

------
enturn
I don't have the stats to back this up but I imagine there would be a
significant portion of unsavory comments that appear from people/bots that
didn't actually watch a large portion of the video. I think it would work in
YouTube's and creators' favor to disable adding a comment until you watch a
large portion of the video or load all the ads.

------
ars
Seems to me that requiring the uploader to approve all comments would work
just as well, without cutting video creators off from their audience.

~~~
redwards510
You think giving the attention-seeking 13 year old uploader the ability to
approve comments that praise her ability to lick a popsicle is a solution?
These kids don't understand what is going on.

------
GreenToad5
If only they could develop some kind of machine learning AI tech that could
make highly suspicious comments require approval.

I guess they just aren't sophisticated enough to do that... But I guess they
must have AI that can identify kids in videos accurately (even 17 1/2 year
olds).

------
swayvil
Ah yes, halting conversation for our own good. We see this on reddit all the
time. The conversation gets hot so the moderator locks the thread.

How does this help anything? I've actually asked. And I never got a straight
answer. I figure it's some kind of moderator-loss-of-control-anxiety-reaction.

------
theNJR
We are in the midst of an ideological battle. Network effects combined with
tribal conflict are unleashing something insane.

I've been writing a longform article about this since Monday, and it seems
like twice a day something new comes out like this. Hard to keep up.

------
hindsightbias
Have there been any studies on comment quality at free vs. subscription
(paid) sites?

Perhaps Patreon should just add comments and subscribers can comment there.
Content creators could link to it and it would be read-only unless you pay.

Most edgelords aren't going to pay to play.

~~~
egypturnash
Patreon has comments already, fwiw.

------
cmiles74
This doesn't seem to be getting a lot of play, but isn't it important to note
that YouTube has been generating ad revenue by serving this content to this
audience? It seems like they are only stopping this practice because the
amount of revenue that they will lose from Nestle, Disney, etc. exceeds
whatever amount of revenue they had been generating by aggregating these
videos and serving them to this specific audience.

Preventing the aggregation and serving of this class of content seems like the
larger problem.

It's also not clear to me if the removal of the comments will prevent the
aggregation of these videos. My suspicion is that even with the comments
removed, these videos will still be grouped together and will still garner the
same undesirable audience. I don't see any talk of somehow clearing whatever
history Google has collected that has caused them to be clumped together in
the sidebar.

------
mrweasel
How is it possible that it's easier for YouTube to identify children (people
under the age of 18) in videos, compared to identifying certain types of
comments and automatically moderating those?

Or is it some law regarding censorship that's the issue?

~~~
rossdavidh
Children's appearance is not likely to change, in ever more sophisticated
ways, so as to still appear like children to humans but not to the algorithm.
Thus, while not intrinsically harder than text analysis, it could very easily
be the case that it is easier to flag videos as having children (and thus turn
off comments) than it is to automatically police comments.

Note that they said it won't apply to videos of teenagers; my guess is that
this is because it would be more difficult to automatically distinguish
between teenagers and young (or just young-looking) adults.

------
madrox
Everyone seems to be blaming Google, but can we acknowledge that this wouldn't
even be a problem if people weren't jerks? The only reason Google even has to
do anything is because people are posting predatory comments.

------
gm3dmo
They've missed a trick: Make adding a comment only available to subscribers.

------
shusson
Stupid question: why doesn't reporting and flagging comments work?

------
rixrax
And while at it, add forced moderation of comments for the rest of the
videos, where the channel owner needs to approve, and be responsible for, any
comments shown on their videos. And Facebook too.

------
benj111
I've watched numerous videos on YouTube, complaining about how YouTube treats
its creators.

How long is it going to be before we get comments turned off on videos
(channels?) for no identifiable reason?

------
jackcosgrove
Back when Facebook came out, everyone thought it was the best business model
ever. You provide a framework, and users provide the content.

Turns out user-generated content can be bad for business.

------
eumenides1
Am I on crazy pills, or aren't YouTube comments trash in general?

Isn't it better to have the creator link to a discussion page with actual
moderation and moderation tools?

------
calbear81
Here's a crazy idea: why not only let children comment on children's videos?
That way, kids that want to express themselves and connect with their peers
can still do so on YT.

~~~
spookthesunset
Great. Tell me a way to determine which accounts are being used by kids in a
way that scales to hundreds of millions of active users...

------
Phenix88be
I don't know how to feel about this. The concept of "commenting on a web
page" is old, but with so many people connected to the internet, do we really
need a comment section on every page? From my point of view, comments are
irrelevant most of the time, especially on Youtube. HN would be the exception.

Comments involve a lot of energy to moderate for no value most of the time. I
know it's nice to have feedback when you create something and share it with
the world, but maybe we don't need a comment system to get user feedback!

~~~
squirrelicus
My thoughts were along the same lines. Why exactly are comments allowed on
YouTube at all? Literally all humans are aware that YouTube comments are a
cesspool.

~~~
tux1968
Really depends on the channel. There are many technical channels that have
interesting and useful comments without many problem posts at all.

------
aero142
I watched the original exposé video of this and found the comments
disturbing, but I don't expect any platform to perfectly prevent such
comments. The more damning part for YouTube was how the recommendation engine
was clustering the videos based on pedo preferences and automatically creating
little pedo playlists. To me, their bigger fault as a platform is that they
are built on automatically giving people more of what they like. Even pedos.

------
ForHackernews
Here's hoping YouTube will see the wisdom in just removing the comments
feature.

------
moonbug
Just adblock everything but the player element. It's the only way to use
YouTube.

~~~
ShhhImAtWork
I like to use 'Reddit on YouTube' on Firefox, which shows relevant reddit
discussion in place of the YouTube comment section.

~~~
Spivak
Implying that Reddit comments are any less of a cesspool. Reddit hosts a lot
of quality discussion from small niche communities but the same is true for
YT.

If anyone can figure out how to maintain quality discussion at the scale of
thousands of speakers they'll be a billionaire overnight.

~~~
bduerst
Comment voting has been the major solution thus far. Any scaled or NLP
automated solution is just an arms race with bad actors.
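(The standard vote-based ranking trick, reportedly what Reddit's "best" sort uses, is the lower bound of the Wilson score interval, so a comment with 5/5 upvotes doesn't outrank one with 90/100. A sketch:)

```python
import math

def wilson_lower_bound(upvotes, downvotes, z=1.96):
    """Lower bound of the 95% Wilson score interval on the true
    upvote fraction. Ranking by this rewards both a high upvote
    ratio and enough votes to be confident in it."""
    n = upvotes + downvotes
    if n == 0:
        return 0.0
    p = upvotes / n
    denom = 1 + z * z / n
    centre = p + z * z / (2 * n)
    spread = z * math.sqrt((p * (1 - p) + z * z / (4 * n)) / n)
    return (centre - spread) / denom
```

A comment at 90 up / 10 down outranks one at 5 up / 0 down, because the small sample's interval is much wider.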

~~~
moonbug
And it's no solution.

~~~
bduerst
I remember the days before Digg, Reddit, and Youtube voting. It's orders of
magnitude better than that garbage.

------
RIMR
Alright, cool. Now all the pedophiles can lurk in secrecy and collect all the
recordings and repost them on pedophile websites, and YouTube can pretend it
has nothing to do with it because none of the offending comments are under the
videos making it abundantly obvious who views these videos.

------
doorbellguy
YouTube algorithms. How effective are they at this sort of thing?

------
balls187
I wonder why Youtube has this problem, and not say Instagram?

------
dangerface
Good call they should remove them on all videos.

------
matz1
This is the time for someone to make an alternative platform with a different
policy.

~~~
hn_throwaway_99
Oh yes, I imagine that a video site whose raison d'être is "we allow comments
on videos of children" will be a real bastion of high-level discourse.

~~~
matz1
They don't have to state it that blatantly.

Besides making another platform, what other alternatives are there, though?

------
mruts
Or, you know, maybe the kids/parents could turn off the comments themselves?

~~~
bilbo0s
YT isn't really worried about the kids/parents.

They're worried about the Feds in the same way that craigslist was worried
about the Feds when they scrapped their "escorts" section. (And the same way
that Backpage _should have_ been worried about the Feds when they hosted their
"escorts" section.) If the Feds find that pedophilia stuff on your servers,
you better have a good lawyer. Which I assume YT does, and just like
craigslist, they have concluded that the nuclear option is the one that will
keep the maximum number of executives out of federal court.

My question is this, is there a better way? Is there a way to prevent these
undesirable comments other than just taking comments down for _everyone_?

I understand YT's position. They have to make money, and their executives
would just as soon not receive the Backpage treatment. That said, is there a
better way?

~~~
mruts
Fair enough. And I know companies need to act in the best interests of their
shareholders. But shouldn’t a backpage exist? We need companies to provide
services that the government (which is still stuck in the dark) doesn’t
approve of: prostitution, drugs, etc.

------
tanilama
Youtube comments tend to be cancer anyway. Won't miss them.

------
yason
The march to make the world a safe, sanitised place has so many bumps and
slippery slopes on the road that it's hard to gauge whether we're making
progress or not. The internet is hard to tame and so is human nature, for
both good and ill.

The trigger for this was some big companies pulling ads. I suppose that in the
context of "predatory and obscene" comments regarding children it's quite
understandable. The problem with that is what happens next time, when the
advertisers pull ads because there is something else they don't like?
Eventually, it leads down to political correctness and self-censorship as
there is always someone who gets offended, and even the possibility of getting
kicked out of Youtube or the service losing ad revenue keeps everyone on their
toes.

Further, disabling comments fails to address another issue: pictures and
videos of children freely available on the internet. That cat is out of the
bag for good. The technical response would be separate (private or public)
paedophile forums that just link to Youtube videos and make comments on the
videos there. That's outside of Youtube's business and keeps BigCos from
having bad PR, but would still matter to any underage person shown in an
internet video.

So the real enabler here is the proliferation of user-published content.
Since people seem to like sharing stuff online, inevitably also with
strangers, it is conceptually pretty hard to even limit the audience for that
sort of material. Fighting that process is like the war on drugs: it will
just claim casualties one after another, left and right, whereas nothing
really changes. While trying to make things nice and clean for everyone,
babies get thrown out with the bath water.

Dirt on the internet is like a law of nature. There has always been dirt,
sometimes more and sometimes less. While paedophilia is the true hot potato
nobody really feels comfortable with, the _other_ dirt is a valuable cross-
section of where people actually stand. Bad (Youtube) comments can be so
revealing at times, and they really pull you out of your bubble.

I don't like the general process of eliminating dirt and trying to set a
politically correct, business-approved bar for quality, because what remains
after that is a shallow reflection of actual persons and the human psyche.

I don't know what Youtube should do either. The concept of brand companies
disliking that their advertisements are shown next to offensive _third-party
comments_ is very much absurd in the first place, so how can you deal with
that?

I can understand that companies want to whitelist or blacklist specific
categories and channels where they want (or don't want) their ads, but getting
pissed off by comments completely unrelated to the advertisements is where things
get hard. That's like pulling ads from a magazine because the magazine
happened to have pictures of children and a group of molesters happened to sit
in a pub making predatory comments about those pictures next to the
advertisement. What was previously unimaginable apparently turns into
reality when you do it on the internet.

------
twblalock
Social media scandals like this one are a reminder that there are larger
numbers of closet pedophiles, racists, Nazi sympathizers, and conspiracy
theorists in our societies than many people realized.

Social media gives them a means to discover each other and collaborate, but
they were always there.

I agree with Youtube’s move here, but the root cause of the problem is that
our societies are worse than we thought they were. Fixing that is going to
take more than policing comments on websites.

~~~
deelowe
Did you look into the scandal? It seemed pretty overblown to me.

~~~
DanBC
Paedophiles were making sexualised comments to children in Youtube comments.
Advertisers were unhappy. That doesn't seem particularly overblown.

------
goshx
Banning comments is an important step IMHO, but how about putting the people
making the comments behind bars? They will keep watching these videos.

~~~
ceejayoz
There's no law making creepy comments like these illegal, at least not in the
US. (I can't speak for other countries, and US law is what's relevant here.)

~~~
goshx
That doesn't mean they can't or shouldn't be investigated.

~~~
ceejayoz
I doubt a creepy comment is sufficient probable cause to open an
investigation, barring other indicators.

~~~
goshx
A google search has been sufficient before. Remember the pressure cooker and
backpacks search from a family in New York?

~~~
ceejayoz
A Google search was _not_ sufficient in that case. The guy's employer tipped
them off after firing him - it wasn't some sort of automated keyword
monitoring at Google.

[https://www.theatlantic.com/national/archive/2013/08/governm...](https://www.theatlantic.com/national/archive/2013/08/government-
knocking-doors-because-google-searches/312599/)

~~~
goshx
The search alone wasn't sufficient, but was enough to start it all. In
YouTube's case, there are plenty of comments that demonstrate someone is a
sexual predator. In my view, if someone "tipped them off", it would be the
exact same scenario.

~~~
ceejayoz
"A random person on the internet is posting creepy comments" is likely not
probable cause, especially as you'd have to convince someone national like the
FBI to care without a specific location.

"Someone I know personally is posting creepy comments, and I've always gotten
a bad vibe about their interactions with kids" is probably enough to get
_local_ cops interested enough to knock on a door.

~~~
jjjensen90
I hope not. Unless there is evidence of a crime that has occurred or evidence
that a crime is planned, they have no business (constitutionally) bothering a
citizen based on that kind of tip. This is how lives are ruined when there are
no victims.

------
dalbasal
Youtube comments...

Ultimately, YT comments are fairly horrible. Apart from an occasionally decent
joke or even more occasional comment from the video's subject, youtube
comments are either nasty or banal... with a high % of nasty. Racism, nasty
politics, sexual harassment, other kinds of harassment... The median _top_
comment is a troll.

Of all the major social media, I can't think of a worse commenting culture.

Banning comments on videos featuring children sounds like a decent decision
from "the suits" to curb the most harmful effects but... where are the product
people?

Surely, someone, at some point, has had ideas for how to improve this travesty
of a discussion feature. It's not like you're risking much. Even if changes
kill commenting entirely...

How could google let this be so terrible for so long?

~~~
oaw-bct-ar-bamf
If you think YouTube comments are bad, I recommend visiting Twitch (gaming)
stream chats.

It will change your perception of 'horrible comments'.

------
forrestthewoods
YouTube would be better if they banned all comments on all platforms.

I use the HerpDerp plugin. It converts all comments to herps and derps. A+++++
would recommend.

[https://chrome.google.com/webstore/detail/herp-derp-for-yout...](https://chrome.google.com/webstore/detail/herp-derp-for-youtube/ioomnmgjblnnolpdgdhebainmfbipjoh?hl=en-US)

------
mortdeus
Theres something called a parental control and last i checked they were pretty
effective for concerned parents.

Google does not need to be in the business of trying to provide parental
oversight over creators who just so happen to be minors.

Plus, don't we want to retain the ability to profile people who show
concerning indications that they may be willing to act on an impulse to
commit paedophilia?

IMHO, the best course of action to deal with paedophiles on YouTube who think
they are protected by the anonymity of the internet is to develop a way for
authorities to obtain enough solid evidence to enable a federal prosecutor to
bring justifiable charges against YouTube users who knowingly attempt to
proposition a minor into an act of sex.

Also, I need to explicitly throw out an IANAL disclaimer right now: I am NOT a
lawyer nor a student of law. With that said, I'll delve into this more.

But theoretically speaking, from a prosecutorial point of view (and again I
stress that IANAL), it shouldn't be hard to make a logically sound legal case
that a sexually provocative comment targeted at a minor who is the subject of
a video, posted in that video's public forum (regardless of how vague,
indirect, or seriously intended the comment may be), constitutes knowingly
propositioning a minor.

