
YouTube admits 'wrong call' over deletion of Syrian war crime videos - jacobr
http://www.middleeasteye.net/news/youtube-admits-wrong-call-over-deletion-syrian-war-crime-videos-1140126078
======
alexandercrohde
I think YouTube needs to consider backing off from regulating political content.

The fact is politics and morality are inherently intermingled. One can use
words like extremist, but sometimes the extremists are the "correct" ones
(like our founding fathers who orchestrated a revolution). How could any
system consistently categorize "appropriate" videos without making moral
judgements?

~~~
clarkmoody
Interesting that we have a whole vocabulary of words reserved for those on the
"wrong side": insurrection, sedition, traitor, deserter, criminal, smuggler,
terrorist, extremist, rebel, revolutionary, etc.

But from the "right side" of an ideology, those are the freedom fighters,
visionaries, defenders, Founding Fathers, Underground Railroad, idealists,
etc.

Controlling vocabulary is a very powerful tactic in politics, as illustrated
in _1984_. People respond in very predictable ways to certain words, hence
their power.

~~~
3131s
Why doesn't YouTube remove this "extremist" content, I wonder...

[https://www.youtube.com/watch?v=BWsnYxJ1ghA](https://www.youtube.com/watch?v=BWsnYxJ1ghA)

~~~
throwaway0255
It's surprising to me how sanitized that footage feels. You watch a hundred
people get killed, but it's not really striking a nerve the way a single
beheading can.

That's probably why footage of the American military usually doesn't get taken
down. Our military kills from a thermal view half a mile away.

~~~
Retric
US military spends a lot of time and effort on propaganda. Lines like "bring
our boys home" are literally inserted into hundreds of movies. They also
release a lot of this type of footage to portray combat as less horrific than
it actually is.

------
itaris
I'm as much a proponent of automation as anyone else. But I think right now
Google is trying to do something way too hard. By looking for "extremist"
material, they are basically trying to determine the intent of a video. How
can you expect an AI to do that?

~~~
jmcdiesel
I don't think the problem is automation...

It's people's expectation that it be perfect, and the egoic drive to blame
someone when something goes wrong. There was no reason for the hype around
this story... an AI classifier produced a false positive. That's not Google
attacking the videos; that's a technical issue, and it needs to have zero
feelings involved, because the entire process happened in a damned computer
incapable of feelings...

But everyone needs to feed their outrage porn addiction...

~~~
colordrops
It's not a technical issue. Software is not yet capable of accurate content
detection, and even if it were, it's not clear whether this sort of thing
should be automated. It's not like Google can just change a few lines of code
and the problem is gone.

~~~
jmcdiesel
The point is, there will be false positives; there is no reason to get upset
and hurt over them...

There is no perfect system. If it's automated, there will be false positives
(and negatives); if there is a human involved, you have a clear bias issue; if
there is a group of humans involved, you have societal bias to deal with...

There is no perfect system for something like this, so the best answer is to
use something like this, which gets it right most of the time... then clean up
when it makes a mistake. And you shouldn't have to apologize for the false
positive; people need to put on their big boy pants and stop pretending to be
the victim when there is no victim to begin with...

~~~
saurik
This is the exact same argument for "stop and frisk", and that is just totally
NOT OK.

~~~
jmcdiesel
That's exactly the OPPOSITE of stop and frisk.

1) Stop and frisk is BIASED heavily on race because it's a HUMAN making the
choice...

2) Stop and frisk is the GOVERNMENT, and therefore actually pushes up against
the Constitution.

How do you see these things as remotely the same?

~~~
saurik
The false positives are not random: they target minorities; these automated
algorithms designed to filter hate have also been filtering people trying to
talk about the hate they experience on a daily basis. They keep people from
even talking about events they are attending, such as Dykes on Bikes. It is
NOT OK to tell these people to "put on their big boy pants" and put up with
their daily dose of bullshit from the establishment.

[https://www.washingtonpost.com/business/economy/for-facebook-erasing-hate-speech-proves-a-daunting-challenge/2017/07/31/922d9bc6-6e3b-11e7-9c15-177740635e83_story.html](https://www.washingtonpost.com/business/economy/for-facebook-erasing-hate-speech-proves-a-daunting-challenge/2017/07/31/922d9bc6-6e3b-11e7-9c15-177740635e83_story.html)

[https://www.lgbtqnation.com/2017/07/facebook-censoring-lesbians-using-word-dyke-posts/](https://www.lgbtqnation.com/2017/07/facebook-censoring-lesbians-using-word-dyke-posts/)

------
molszanski
Let's look at the bigger picture. First, in March some newspapers found an
extremist video. It had ~14 views and YT advertising all over it. They made a
big deal out of it. As a result, YouTube lost ad clients and tons of money.

Then, in response, YouTube built an algorithm. They don't want people to call
them a "terrorist platform" ever again. Hence they take down the videos.

Now this algorithm is hurting bystanders. IMO the real problem is the public
and business reaction to the initial event.

And this piece of news is an inevitable consequence.

~~~
chii
If you look at this sequence of events, the deeper trouble is really that of
the newspapers (or any other media outlet). Their incentive is to find things
that will shock people and report them to get views/clicks (which fuel
advertising). They aren't going to fully explore the pros and cons, like
long-form journalism would, since that's expensive and they get paid peanuts
by advertisers.

The public's reaction is going to be one of outrage, because the news article
was _designed_ to evoke such feelings! If public sentiment were more liberal,
the article would simply have picked an even more extreme event and evoked the
same reaction.

Fix the problem at its root: advertising-funded news and media.

~~~
molszanski
Couldn't agree more. In my opinion, advertising is not the _only_ problem.
Remember, YT is a competitor to the "old media", and hurting YT is in the old
media's best interest. I hope initiatives like WikiTribune will pave the way
to a better future.

------
RandVal30142
Something people need to keep in mind when parsing this story is that many of
the affected channels were not about militancy; they were local media outlets.
Outlets that only gained historical note because of what they documented as it
was unfolding.

In Syria outlets like Sham News Network have posted thousands upon thousands
of clips. Everything from stories on civilian infrastructure under war, spots
on mental health, live broadcasts of demonstrations.

Everything.

Including documenting attacks as they happen and after they have happened.
Some of the affected accounts were ones that documented the regime's early
chemical weapons attacks. These videos are literally cited in investigations.

All that is needed to get thousands upon thousands of hours of documentation
going back half a decade deleted is three strikes.

Liveleak is not a good host for such outlets, because it is not what these
media outlets are about. Liveleak itself deletes content as well, so even if
the outlets fit the community, it would not be a 'fix.'

------
jimmy2020
I really don't know how to describe my feelings, as a Syrian, knowing that
the most important evidence of the regime's crimes was deleted because of a
"wrong call". And it's really confusing how an algorithm can mistake what is
so obviously distinct from ISIS propaganda, like a family buried under the
rubble, and this statement makes things even worse. Mistakenly? Because there
are so many videos? Just imagine this happening to any celebrity's channel.
Would YouTube issue the same statement? I don't think so.

------
ezoe
What I don't like about these web giant services is that getting human
support requires starting social pressure like this.

If they fuck something up through automation, contacting human support is
hopeless unless you have a very influential SNS presence or something.

------
tdurden
Google/YouTube needs to admit defeat in this area and stop trying to censor,
they are doing more harm than good.

~~~
Buge
Just admit defeat and let all the advertisers pull out?[1] Then go bankrupt
and let everything get deleted (as Soundcloud was on the verge of doing)?

[1] [https://www.theguardian.com/technology/2017/mar/25/google-youtube-advertising-extremist-content-att-verizon](https://www.theguardian.com/technology/2017/mar/25/google-youtube-advertising-extremist-content-att-verizon)

~~~
CaptSpify
Maybe the advertising-funded content model needs to be called into question, then.

------
balozi
Well, the AI did such a bang-up job sorting out the mess in the comment
section that it got promoted to sorting out the videos themselves.

~~~
charlesism
5 minutes reading YouTube comments is enough to make me ill. I don't know what
a couple hours a day after school would do to a person after ten years. We'll
all find out, I guess, once this generation of kids reaches adulthood.

~~~
dilemma
Probably has the same effect as computer games did on us - little.

~~~
charlesism
Statements like that always puzzle me. How would you be able to tell?

For example, it is entirely possible that they called it right back in the
1950s, when they said rock and roll would ruin society. We have no access to
what would have happened without it, and so nothing with which to compare.

~~~
charlesism
...should stress I have nothing against Elvis :) It's just useful as a
thought experiment.

------
osteele
HN discussion of deletion event:
[https://news.ycombinator.com/item?id=14998429](https://news.ycombinator.com/item?id=14998429)

------
DINKDINK
What about all the speech that's censored but doesn't have enough interest or
political clout behind it to make people aware of the injustice of its
censoring?

------
williamle8300
Google (parent company of YouTube) already sees itself as the protector of the
public's eyes and ears. They might be contrite now, but they behave like a
censorship organization.

------
norea-armozel
I think YouTube really needs to hire more humans to review video flags rather
than leaving it to a loose set of algorithms and the swarming behavior of
viewers. They assume wrongly that anyone who flags a video is honest. They
should always assume the opposite and err on the side of caution. This should
also apply to any Content ID flagging. It should be the obligation of accusers
to present evidence before content is taken down.

~~~
schoen
> They assume wrongly that anyone who flags a video is honest.

I don't think they assume that at all. If they did, you'd see at least an
order of magnitude more videos removed.

I agree with the sentiment of your criticism, but I think we could phrase it
more in terms of prior probabilities or something about the false positive and
false negative rate in their review process. Flagging of videos is _extremely_
common and even a small amount of unreliability in the review process
translates into a huge number of mistakes.
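That scale argument is easy to make concrete with a back-of-envelope calculation (the flag volume and error rate below are made-up figures, purely for illustration):

```python
def review_mistakes(flags_reviewed: int, error_rate: float) -> int:
    """Expected number of wrong moderation decisions, given how many
    flags get reviewed and how often the review process errs."""
    return round(flags_reviewed * error_rate)

# Hypothetical figures: 200,000 flags reviewed per day, 1% unreliability.
print(review_mistakes(200_000, 0.01))  # 2000 wrong calls per day
```

Even a 99%-accurate review process, at that kind of volume, produces thousands of mistaken removals or retentions every single day.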

Also, users of the site don't actually agree with each other much at all about
which removals were in error; we could say that there's absolutely abysmal
inter-rater reliability if the end-users of the site are the "raters" of the
quality of content removal decisions.

Also, most people who flag things don't necessarily know much at all about
YouTube's terms of service or how YouTube has interpreted or applied them in
the past, so it's hard to be clear on what it means for flaggers to be honest
or dishonest. Probably the most common meaning of flagging is "ugh, I'm upset
that this video is up on YouTube".

~~~
norea-armozel
The biggest problem, IMO, isn't the random flagger but rather the concerted
action of groups flagging videos. This is obvious when reddit or 4chan users
swarm a channel they don't like. This kind of behavior needs to be mitigated
in some way. I think a quick solution would be to enforce a cool-down timer on
flagging of 24-48 hours for all users, to ensure they're not abusing the
system. That should also cover DMCA takedowns filed by random users who
aren't partnered with YouTube in some way.
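A per-user cool-down like the one proposed above could be sketched roughly as follows (a hypothetical in-memory version; the user IDs and the 24-hour window are illustrative assumptions, not anything YouTube actually implements):

```python
import time
from typing import Dict, Optional

class FlagRateLimiter:
    """Hypothetical per-user flag cool-down: after flagging once, a user
    must wait `cooldown_s` seconds before flagging again."""

    def __init__(self, cooldown_s: float = 24 * 3600):
        self.cooldown_s = cooldown_s
        self._last_flag: Dict[str, float] = {}

    def try_flag(self, user_id: str, now: Optional[float] = None) -> bool:
        """Record the flag and return True, or return False if the user
        is still inside their cool-down window."""
        now = time.time() if now is None else now
        last = self._last_flag.get(user_id)
        if last is not None and now - last < self.cooldown_s:
            return False  # still cooling down; reject the flag
        self._last_flag[user_id] = now
        return True

limiter = FlagRateLimiter(cooldown_s=24 * 3600)
print(limiter.try_flag("alice", now=0))      # True: first flag accepted
print(limiter.try_flag("alice", now=3600))   # False: within the 24h window
print(limiter.try_flag("alice", now=90000))  # True: window has elapsed
```

A production version would need persistent storage, and would probably scale the window with a user's past flag accuracy rather than using a flat 24-48 hours for everyone.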

~~~
ue_
Very much agreed on this swarm behaviour. A political channel made by a very
kind person with good intentions didn't seem to be breaking any rules at all,
yet the /pol/ board on 8chan coordinated mass-flagging attacks against his
videos _twice_, which resulted in his channel being deleted twice.

------
pgnas
YouTube (Google) has become the EXACT thing they said they would never be.

They are evil.

~~~
zeep
At least someone at Google was honest enough to drop that motto...

------
miklax
Bellingcat account should be removed, I agree on that with YT.

------
762236
Automation is the only real solution. These conversations always seem to
overlook that normal people don't want to watch such videos. Do you want to
spend your day watching this stuff to grade it?

~~~
EpicEng
Yet YouTube is admitting that the videos should not have been pulled, and
there's no AI in the world that could have made the right call here. So...
what sort of automation are you suggesting? It seems the real solution is the
exact opposite of what you're proposing: human review by better-trained
personnel with clearly defined criteria.

~~~
762236
This is irresponsible toward the trained personnel. We have job-safety
requirements for people working on assembly lines. We should also have
psychological safety for people, and there are plenty of stories of employees
who are paid to view toxic videos suffering for it (some even developing
PTSD).

~~~
EpicEng
So we shouldn't allow jobs which may expose employees to psychologically
harmful experiences? You realize you'd have to outlaw a whole slew of
professions, right? If you can't take it then don't work the job, Yeesh.

