
Nestle, Disney Pull YouTube Ads, Joining Furor Over Child Videos - nopriorarrests
https://www.bloomberg.com/news/articles/2019-02-20/disney-pulls-youtube-ads-amid-concerns-over-child-video-voyeurs
======
hombre_fatal
>On Sunday, Matt Watson, a video blogger, posted a 20-minute clip detailing
how comments on YouTube were used to identify certain videos in which young
girls were in activities that could be construed as sexually suggestive, such
as posing in front of a mirror and doing gymnastics.

I looked at some of the videos that appeared in his report and it's basically
family videos of young girls, the same kind of videos my own sisters would
shoot with our dad's camcorder. His first example is the set of videos that
come up when you search
[https://www.youtube.com/results?search_query=orbeez+bath](https://www.youtube.com/results?search_query=orbeez+bath).

I'm sure these videos do attract some weirdos in the comments, and in the next
paragraph YouTube says that they want to moderate that activity, but what else
do you do here? Ban kids uploading a video of their innocent pool party
because some creeps might enjoy it?

The YouTuber's video includes the provocative phrase "Sexual Exploitation of
Children", but I only saw harmless videos of kids having fun when I looked at
his own examples. He also focuses a lot on the comments themselves, which is a
much different argument.

~~~
baddox
> it's basically family videos of young girls, the same kind of videos my own
> sisters would shoot with our dad's camcorder.

Your family camcorder videos probably weren't made available for checkout at
the public library, filed under "young girls doing gymnastics" in the
library's card catalog. If they were, I imagine your family and many members
of the public would be creeped out or upset.

> but what else do you do here?

Require (or maybe automatically force, if implemented well) those videos to be
set to Visibility: Unlisted. I don't think anyone would be upset with sharing
these links with family.

~~~
acl2149
Not sure that analogy makes sense. YouTube makes it pretty clear you are
publishing your video to the public when you upload the file. You actually
have to confirm the video is going to be public. It's not like YouTube is
raiding the family archives for the content.

~~~
ipsum2
> YouTube makes it pretty clear you are publishing your video to the public
> when you upload the file

After working in tech for long enough, you realize that no one reads anything,
even if it's absolutely clear in the UI. And they will blame you for it.

~~~
tptacek
As well they should, because (the hypothetical) you built and deployed
something that relied for safety on users doing something you knew they
wouldn't do.

~~~
taneq
So in your world, we should just never give users nice things because we know
they're stupid?

~~~
Asooka
No, you make the default safe and make the "dangerous" option hidden behind an
advanced menu, so at the very least you can be certain the user can read and
follow directions. Like a two-stage weapon switch.

------
robbrown451
This just seems ridiculous. I post a lot of videos of my daughter to share
with my family and friends, and generally don't care who looks at them any
more than I care who looks at her (or what they are thinking) when we are in
public. I've gotten a couple of creepy comments here and there from random
people who found them (the comments weren't overtly sexual, but I deleted them
anyway). Here's an example of a harmless one that got such comments (note
that my daughter is the one in the jeans, I tried to be careful when shooting
it regarding the girl in the skirt):
[https://www.youtube.com/watch?v=1p21NMw54Y0](https://www.youtube.com/watch?v=1p21NMw54Y0)

I also notice that video has 10 times the number of views as the rest of the
videos of her, presumably because of the title.

I had a couple videos from when she was younger and wearing diapers and a
shirt when she was wading in the bay, and got a comment from someone who
wanted to know what brand of diapers she was wearing (wtf?). I deleted the
comment of course, but a while later I saw that YouTube had deleted the video
for "inappropriate content" (double wtf). I mean, she was more covered up than
if she was wearing a two piece bathing suit.

The thing is, even if someone is a perv and gets off on this sort of thing, it
isn't endangering her. I'm not going to lose sleep over it. If I am really
that worried about protecting her from pedophiles, I probably shouldn't take
her anywhere in public, where someone could follow us home or otherwise
directly cause her harm. I'm not going to live my life in that kind of fear.

~~~
kelnos
While certainly child predators are an issue, I don't even _need_ to consider
this sort of thing from that angle.

Why do you need to post publicly rather than marking it as unlisted and
sharing the link with people you know and trust? Would you personally like to
have a video posted of you in your underwear? Even if that's okay with you,
perhaps your daughter will grow up preferring to have a more private life, and
to keep things like that off the internet? But you've taken that choice away
from her.

One of my favorite articles on this topic:
[https://slate.com/technology/2013/09/facebook-privacy-and-
ki...](https://slate.com/technology/2013/09/facebook-privacy-and-kids-dont-
post-photos-of-your-kids-online.html) (and a follow-up, after the author was
informed that she had unwittingly allowed some photos of her daughter out onto
the internet: [https://slate.com/technology/2013/09/privacy-facebook-
kids-d...](https://slate.com/technology/2013/09/privacy-facebook-kids-dont-
post-photos-of-your-kids-on-social-media.html)). I don't know that I'd go as
far as she has, creating a sort of "digital account trust", but I do like the
approach of keeping a kid's private life off the internet until they're old
enough to decide how much they want to share.

~~~
robbrown451
As far as underwear goes, no I wouldn't care if I was under age 3 in the
photo. Seeing a kid in a diaper is an everyday public thing. In that case, she
was in a diaper rather than a bathing suit because she was still diaper age. I
don't see it remotely as something to be embarrassed about.

I don't "need" to post publicly but I also have no reason not to. I'm not
going to let people having creepy thoughts control my life, unless there is an
actual, tangible danger of harm.

Honestly, if I was worried about creeps, I wouldn't let her wear a skirt in
public. I see other girls at the playground all the time that make me wince a
bit...I'm far more likely to dress my daughter in pants. But at the end of the
day, I'm 1000 times more concerned about people in real life than on
YouTube....because they are right there.

~~~
Waterluvian
Have you considered that your child might really regret all this stuff you're
publishing about them when they're older?

I don't exactly care myself but I do appreciate that my kid isn't capable of
consenting to having his childhood published. So I don't do it.

It seems really odd that your reason for making that call is, "meh, why
not?"

~~~
Noumenon72
It's more like having considered it, I have no idea why they would care. A
picture of me as a kid is essentially a picture of some other person who is
five years old. When I see a picture of some five-year-old kid I don't think
"Wow, they might feel terrible if they knew I could see this."

Maybe the sleepy baby girl from
[https://www.youtube.com/watch?v=KTCQpjUrCe8&index=147&list=L...](https://www.youtube.com/watch?v=KTCQpjUrCe8&index=147&list=LL3R4zsTvaSCjzcV59_EiJtQ)
will grow up to be really shy and feel that wasn't her best angle. That
strikes me as a personal hangup, not some constitutional right we should all
be protecting her from.

It's funny how heated I can get about the notion "there are people whose
reaction _isn't_ 'meh, why not?' What's wrong with them?" I'm like a militant
meh-why-notter.

~~~
robbrown451
Militant meh-why-notter. Can I use that? :)

(thanks for your support, btw...)

------
14
YouTube wants to cater to everyone, but to me it is clear YouTube is a risky
place for a child. I let my kids use the site, but I have no expectations of
anything less than what we are seeing here, and I try to keep an eye on things
from behind. I myself have found these types of wormholes he talked about in
the video. I searched for massage techniques, saw a pretty girl, and clicked,
and the next thing I knew I was down a wormhole of naked women being massaged
on YouTube. Another wormhole is the one pointed out by another YouTuber about
breastfeeding videos. Search for breastfeeding and you will see these creepy
videos pretending to be educational but which are clearly just a video of a
woman showing her breasts and a child feeding. Then I came across videos that
seemed to be educational, but really it was just a large-breasted woman
standing naked while the video described all her lady parts with close-up
shots. So to me, YouTube clearly has all sorts of crazy things, from how to
have sex to how to build weapons. I am not kidding myself: it is a
kid-hostile environment. But they do have so many good videos for kids, so I
allow my kids on the platform. I just do my best to keep an eye on what is
going on.

~~~
komali2
What I don't get is why the fringe community even exists - is there that large
of a market of people with porn websites blocked, but youtube not blocked?

~~~
Loughla
People at work?

Other than that, honestly, the only population I can come up with who would be
watching these on youtube instead of an actual porn site would be kids.

~~~
Mirioron
What would stop kids from going to a porn site anyway? The question whether
you're 18+?

------
skissane
My wife and I have lots of photos and videos of our kids on the Internet. Not
on YouTube, but on Facebook and Instagram yes.

I think for most of them, only our "friends" can see them. But social media
friends includes some random person I sat with once on a plane 5+ years ago
and have never spoken to since, and people my wife used to play Farmville
with, and someone one of us went to high school with and haven't spoken to in
twenty years, and so on.

Is it possible that some of these people who see photos/videos of our kids are
getting some sort of creepy pleasure out of it? Probably not, but you never
know.

But, if they are, should I care? I mean, if one of our kids walks down the
street, and some creep glances at them briefly and gets some sick pleasure out
of it, but leaves it all in their head – there is no way we could ever know,
and as disturbing as it is to think about, it isn't harming anyone anyway.

Now, if they take it out of their heads, and start acting on it – harassment,
stalking, grooming, abduction, etc. – very different story. But, that's pretty
unlikely to actually happen. And most random people on the Internet are far
away physically, and so pose little risk of being able to do anything like
that, at least in non-virtual forms. The main protection against the virtual
forms of those things is really monitoring children's Internet usage and
educating them – I doubt the availability of photos/videos makes much
difference.

------
skybrian
It's not the same, but I'm reminded of the Habitat Chronicles [1]:

"I found myself unable to reconcile the idea of a virtual world, where kids
would run around, play with objects, and chat with each other without someone
saying or doing something that might upset another. Even in 1996, we knew that
text-filters are no good at solving this kind of problem, so I asked for a
clarification: 'I’m confused. What standard should we use to decide if a
message would be a problem for Disney?'

"The response was one I will never forget: 'Disney’s standard is quite clear:
No kid will be harassed, even if they don’t know they are being harassed.'"

So now it's 2019, and here we are. I guess the filters have gotten better. We
certainly expect a lot more from them.

[1] [http://habitatchronicles.com/2007/03/the-untold-history-
of-t...](http://habitatchronicles.com/2007/03/the-untold-history-of-toontowns-
speedchat-or-blockchattm-from-disney-finally-arrives/)

------
post_break
YouTube only seems to do something when there's a controversy and ad dollars
are on the line. They're putting out fire after fire; I don't know how the CEO
keeps her job. DaddyOFive, weird children's ASMR, Elsagate, the algorithm
being 2 clicks away from weird, obscure kid videos.

~~~
xxpor
>YouTube only seems to do something when it's a controversy and Ad dollars are
on the line

"Humans respond to incentives" isn't really surprising IMO.

~~~
CyberDildonics
You can hire firefighters, but I would rather hire people who know how to
build a house that doesn't burn down.

~~~
bduerst
I don't get this analogy. Are you hiring firefighters to build your house? Are
the ad agencies the fire?

~~~
jshevek
He's just saying that prevention is better than responding/reacting after the
fact.

------
cronix
Just get rid of comments, which are mostly useless crap anyway. Or force the
content uploader to manually approve individual comments on their videos
before they can be made public and hold the uploader responsible for
everything shown.

The content is innocent (from what I've seen).

Might as well ban catalogs and all underwear/swimsuit ads. They show young
girls (and boys) in underwear, swimsuits, leotards, etc. Some deviant
somewhere will use that for purposes other than it was intended. Might as well
get rid of it all, everywhere, because _someone, somewhere_ will misuse it.

~~~
djsumdog
They'd probably have to get rid of playlists too. Like the other comment said,
a lot of these are just innocent videos, many of which kids make themselves.
Either that or they're videos from sports games or dance competitions.

The videos themselves are pretty normal. It's the context that the playlists
place them in that's at issue.

------
movedx
An eleven year old shouldn't be uploading content to YouTube. An eleven year
old doesn't understand the second or third order consequences of their actions
when they upload a video of themselves, along with their friends, trying on
bikinis in their bedrooms.

If you're under 16, your content, and the uploading of it, should be
supervised. Why are we letting our children run free on the Internet? You
wouldn't let them run free on a highway and the Internet is just as dangerous
but in different ways.

I believe YouTube needs to implement a system of age verification for content
creators. 16+ minimum if you ask me. If the parents do the uploading, welp,
you found your issue (if any arise).

~~~
brianpgordon
My 10-15 year old self would hate you. Most of what I even remember about
being that age is from my involvement in various online communities. I didn't
have many friends at school, so I talked to people in games and chat instead.
To suggest that my parents should have to be there to "supervise" me is
absurd. To some extent they tried to do this, and I resented it and I'm
certain that it did no good at all. Not least because sanitizing access to the
internet is impossible and if I wanted to get past a restriction of any kind I
found a way.

Your kids talk to each other at school without parents going over their
conversations to make sure everything is kosher. Interacting online isn't any
different, as long as they know not to give out their physical location
publicly or to strangers. Even more so these days, kids are digital natives.
Cutting them off from the network is like giving them a lobotomy. They'll be
well behaved, obedient, sessile outcasts.

Anyway, interacting online is so little like running free on the highway that
it's difficult to even find a foothold to frame a counter-argument. It's a non
sequitur.

~~~
movedx
> Interacting online isn't any different

Except a school is a (mostly) sealed environment. The Internet is not. Anyone
can join the conversation online. Not just anyone can walk into a school
classroom and pretend to be a teenager to social engineer a young girl.

> Cutting them off from the network is like giving them a lobotomy.

I didn't say cut them off.

~~~
tfha
If I didn't have independence on the internet starting the age of 14, I
wouldn't be who I am today. I did many things I would not have been
comfortable doing under supervision, and many of those contributed greatly to
my long term personal and professional growth.

------
srkmno
If I understand this correctly the videos in question are family videos
uploaded and monetized by parents, and the issue is that perverts are
commenting on them, right?

Invariably a small percentage of people are going to suck, it's the way of the
world, what is YouTube or anyone to do about it? Are they now responsible for
everyone who watches a publicly available video?

This brand of hysteria is the product of alarmism and sensationalism regarding
everything social media. It's gotten ridiculous when we blame these platforms
for all of humanity's failings and newspapers are advocating for censorship.

------
alanlamm
I am surprised at how few of the comments here say: if parents and/or kids
want to make non-sexual videos public, fine. If paedophiles are aroused by
them nonetheless, fine too (as long as they do so privately and don't harass
the kids). I don't know why one would presume to have a 'right not to incite
arousal' in another. Personally, I'd choose paedophiles enjoying videos of my
children over censorship by large American corporations any day.

------
vibrato
At some point, social media companies need to realize that they ARE utilities.
They should not attempt to control their content, rather, law enforcement
should prosecute illegal usage.

~~~
brandonjm
It's more difficult for law enforcement to prosecute illegal usage of social
media than of other 'utilities'. Given the international nature of most social
media, illegal usage could happen anywhere on Earth, and local law enforcement
would have no power to do anything about it. If someone started stealing
electricity without paying, or using it for some illegal purpose, it would be
much easier to prosecute them, as they would likely be within the same
jurisdiction the utility operates in.

------
jtolds
I'm surprised that the comments here focus on any specific video and don't
focus on the fact that YouTube starts recommending the _next_ similar video.

YouTube has convinced me that recommendation systems are garbage in general.
Try seeing what YouTube recommends to play after a few plays of searching for
something historic like the moon landing.

We can't crowdsource finding related videos and then shrug like this. This
would not be nearly as big of a deal if YouTube's recommendation system were
dropped entirely.

~~~
dredmorbius
YouTube recommendations are the best possible advertisement for mps-youtube.

(I'm presently listening to a specifically curated queue of 8+ hours of
lectures of an author of recent interest, via Termux on Android, which I can
background, unlike YouTube Web or App clients.)

------
jaimex2
Yay. Apocalypse 2.0 on its way. Looking forward to:

1. Any videos with footage of kids in them de-monetised.

2. Comments disabled by default on any sub-100k-subscriber channels.

3. Mandatory ID verification to be allowed to comment.

~~~
dawnerd
I think that'd be a good thing, honestly. Predators would be scared away if
they had to attach some form of ID to their digital identity. As I mentioned
before, it could be as simple as validating a credit card.

------
toss1
YouTube is an amazing library, but you must absolutely know what you're
looking for.

The search algorithms have been cleaned up a bit in the last year; e.g., a
search for "is the earth round" now pulls up a majority of actual astronomy
vids debunking flat earth, but there is still serious junk in the top 10. A
year or two ago, it was 80%+ junk. Similar for chemtrails. So they're making
progress on the fake news.

But the autoplay and recommendations just run off the rails in a hurry. These
should just be shut down until they can put enough intelligence behind them to
actually keep making sense (i.e., a very long time).

But that's unlikely to happen, since autoplay generates more eyeball-seconds,
and the costs of bad wormholes are external to Alphabet...

------
sigmaris
Instead of hiring enough humans to review the content they make money from,
Google's approach to catching this stuff seems to be to do a global search for
videos with "CP" in related text and auto-ban the channels:
[https://www.bbc.co.uk/news/technology-47278362](https://www.bbc.co.uk/news/technology-47278362)
and then claim that they're making use of "artificial intelligence" to
moderate the platform.

~~~
freeflight
I'd be really interested in how many humans you'd consider "enough" to
actually be able to make an impact.

Because it's quite easy to underestimate just how much content is constantly
uploaded to YouTube, how much of it is watched and reported, on a per second
basis [0].

With numbers like that, you could probably fully employ a small nation of
people and still wouldn't be able to review and catch everything. That is the
main reason they try to automate all their solutions: they need to work at a
massive scale, and manual reviews simply can't do that.

[0] [http://www.everysecond.io/youtube](http://www.everysecond.io/youtube)

~~~
brandonjm
Also they will not involve humans because it removes their ability to blame
the system. If a human employee fails to correctly flag something (which is
likely given the sheer volume of incoming content to review) and it slips
through, YouTube could potentially be held liable for its effects (whatever
they may be). Whereas if the system misses something it's easier to pass it
off as just a bug that needs fixing or an obscure edge case that wasn't
handled.

~~~
freeflight
I doubt that plays any actual role, because as long as it's properly
outsourced, nobody can claim it was one of their employees who censored
something. Which is exactly what happens in the Philippines [0]:

> There are two ways the content is forwarded to the Philippines. The first is
> a pre-filter, an algorithm, a machine that can analyze the shape of, say, a
> sexual organ, or the color of blood or certain skin color. So whenever the
> pre-filter is analyzing and it picks up on something that is inappropriate,
> the machine will send that content to the Philippines and the content
> moderators will double check if the machine was right. The second route is
> when the user flags the content as being inappropriate.

Afaik Facebook, Twitter, and Google all participate in this kind of
"moderation outsourcing" but the only way this is even possible is if they
have AI/ML pre-select content for moderation.

[0] [https://www.vice.com/en_us/article/ywe7gb/the-companies-
clea...](https://www.vice.com/en_us/article/ywe7gb/the-companies-cleaning-the-
deepest-darkest-parts-of-social-media)

------
protomyth
Do big advertisers have a way to pick which creators / channels they advertise
with? It seems like they don't, given the number of times they've completely
pulled out.

------
khazhou
How effective is YT Kids mobile app at filtering out this junk?

------
jumelles
Thought this would be about Elsagate videos. That's another problem that needs
fixing - badly.

------
make3
Really seems ridiculous.....

------
porpoisely
Are they going to pull ads from the Oscars which celebrate an actual child
rapist like Roman Polanski?

Also, Nestle peddles sugary drinks to children and Disney peddles toxic movies
to children. Since when does either company care about children?

I guess this is a win-win for children. Youtube cracks down on pedophiles and
Nestle/Disney stops peddling their toxic product to children.

Edit: Also, Nestle and Disney have children in their ads. Are they going to
stop exploiting children and using their ads to lure pedophiles?

