
YouTube AI deletes war crimes evidence as 'extremist material' - jacobr
http://www.middleeasteye.net/news/youtube-criticised-after-middle-east-video-taken-down-over-extremist-content-1244893230
======
Hasknewbie
Youtube's response regarding one of these videos documenting abuses (emphasis
mine):

> "we've determined that your video does violate our Community Guidelines and
> have upheld our original decision. _We appreciate your understanding_."

Can someone explain to me why corporations, when interacting with customers
regarding complaints/appeals, so often seem to have "don't forget to add
insult to injury" as one of their mottos? Does that kind of patronizing tone
sound polite to the ears of a PR drone?

~~~
viridian
If I remember correctly, the exact same message was delivered to Jordan
Peterson a couple of weeks ago or so, before he sat down with the google memo
guy. He was in the middle of a bible lecture series, and Google banned his
account, and sent the exact same "we've determined that your video does
violate our Community Guidelines and have upheld our original decision. We
appreciate your understanding." message.

It seems tone deaf, especially since in cases such as these there is no
understanding to appreciate. Google will not tell you what you did to violate
policy, only that they checked and found you guilty, and then they snub you
further with the HR speak. It's maddening.

~~~
cvsh
The worst part of the information age is arbitration by unreasonable and
impenetrable algorithms rather than humans with the capacity to make a
judgement call when the rules clearly don't account for the situation at hand.

A transparent appeals process staffed by humans who can at least deliver a
rationale, including what rule you broke, should be required by law. There's
irreparable reputational damage associated with an algorithm libelously
labeling something "extremist content" that isn't.

~~~
viridian
I think the bigger issue is that certain companies have near-monopolies in
their spaces to start with. For plebs like me, youtube is really the only
viable option I have to distribute video media if I hope to build an audience.
The fact that you effectively can't mount an alternative to facebook, youtube,
etc. due to network effects is the larger disease, and this is one of many
symptoms.

~~~
the8472
I think the only solution is a distributed and decentralized web.

Distributed hosting of static content is a sorta-solved problem. But curating,
linking and discoverability (which require mutating content) are a lot harder
due to the trust anchor problem.

~~~
stephen82
Your suggestion exists and has a name: bitchute. I will paste here what they
have for "About" at the end of their main page:

    
    
       BitChute is a peer to peer content sharing platform. 
       Our mission is to put people and free speech first. 
       It is free to join, create and upload your own content to share with others.
    

Feel free to read more about it in their FAQ. I really want to stop using
YouTube and use this instead.

I hope they make it.

[https://www.bitchute.com/](https://www.bitchute.com/)

~~~
vidarh
The big challenge with this is that almost everyone has a "one step too far"
when it comes to what type of content we are willing to tolerate, and/or what
type of content we may get in trouble for hosting, even unintentionally.

That makes it tricky for solutions that "put people and free speech first" to
succeed, because they've basically painted a giant target on themselves, and
it easily makes even a lot of people who sympathise in principle worried about
the bits and pieces that step over _their_ personal line.

Figuring out a reasonable solution to this, I think, will be essential to
getting more widespread adoption of platforms like these.

~~~
the8472
I think the problem is people's expectation that someone else does the
filtering for them. I.e. "I don't want to see this kind of content" leads to
"someone else should remove it from all the sites I visit". Which obviously
leads to conflicting requirements once you have more than one person and those
people disagree on what they want to see and don't want to see.

The only reasonable solution is to host _everything_, modulo requirements by
law, and give users the tools to locally filter out content en masse.

In a decentralized system you also skip the legal requirements, since you
cannot enforce multiple incompatible jurisdictions at the platform level;
individual users will be responsible for enforcing them on their own nodes,
similar to how all you can do when accidentally encountering child porn is to
clear your cache.

~~~
vidarh
The problem on these distributed platforms is not filtering what people _see_,
but filtering what people _host_ or allow to _transit_ their network
connections.

> In a decentralized system you also skip the law requirements since you
> cannot enforce multiple incompatible jurisdictions at the platform level,
> individual users will be responsible for enforcing it on their own nodes,
> similar how all you can do when accidentally encountering child porn is to
> clear your cache.

But that's the thing: You don't skip it. You spread it to every user. They
both have to deal with whether or not they are willing to host the material
and whether or not it is even legal for them.

How many of us sympathise with the idea of running a Tor exit node, for
example, but avoid it because we're worried about the consequences?

These platforms will always struggle with this unless they provide ways for
people to feel secure that the content hosted on their machines is content
they don't find too offensive, and/or that the traffic that transits their
networks is not content they find too offensive.

Consider e.g. darknet efforts like cjdns, which are basically worthless
because their solution to this was to require that people find "neighbours"
they can convince to let them connect. Which basically opens the door to
campaigns to have groups you disapprove of disconnected by harassing their
neighbours and their neighbours' neighbours, just the same as you can go to
network providers on the "open" internet.

~~~
the8472
First of all, not all p2p networks operate like Tor. For example, bittorrent
and ipfs only host content you look at, so hosts could largely self-select the
content they replicate.

Secondly, there are several tiers of content. a) stuff that is illegal to host
b) stuff that is not illegal but that you find so objectionable that you don't
even want to host it c) stuff that you don't like but doesn't bother you too
much d) stuff you actually want to look at. I posit that a) and b) are fairly
small fractions and the self-selection mechanism of "things that I looked at"
will reduce that fraction even further.

And even if you are on a network where you randomly host content you never
looked at, encryption can provide you some peace of mind (of the obliviousness
kind), because you cannot possibly know, or be expected to know, what content
you're hosting. Add onion routing and the person who hosts something can't even be
identified.
identified. If Viewer A requests something (blinded) through Relay B from
Hoster C then B cannot know what they're forwarding and C cannot know what
they're hosting. If neither you nor others can know what flows through or is
stored on your node it would be difficult to mount pressure against anyone to
disconnect.
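The A/B/C scheme described above can be sketched with nested encryption layers. This is a toy illustration only: the XOR "cipher" here stands in for real authenticated encryption, and in a real onion design the viewer would negotiate each key via a key exchange rather than simply holding them, so treat every name and key below as an assumption for demonstration purposes.

```python
import secrets

def xor(data: bytes, key: bytes) -> bytes:
    # One-time-pad-style XOR; a stand-in for real authenticated
    # encryption. Do not use this for anything real.
    return bytes(a ^ b for a, b in zip(data, key))

blob = b"some requested content"

# Hoster C stores only ciphertext, under a key held on the viewer's
# side, so C cannot know what it is hosting.
storage_key = secrets.token_bytes(len(blob))
stored = xor(blob, storage_key)

# Relay B wraps the response in a second layer (a key it shares with
# Viewer A), so B cannot tell what it is forwarding.
relay_key = secrets.token_bytes(len(blob))
forwarded = xor(stored, relay_key)

# Viewer A peels both layers: B's transport layer, then the storage layer.
recovered = xor(xor(forwarded, relay_key), storage_key)
assert recovered == blob
```

Neither intermediate form (`stored` or `forwarded`) equals the plaintext, which is the obliviousness property being argued for: no single node holds both the data and enough keys to read it.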

For the illegal content, especially in oppressive environments, you could
install a Voluntary Compliance(tm) government blocklist on public-facing nodes
and still opt to run an internal node in your network that uses encrypted
connections to retrieve things hosted in other countries you're not supposed
to see.

----

Anyway, back to filtering for decentralized content hosting. I think once you
have a _network_, it is a matter of managing expectations. You can't make
content magically disappear. Platforms like youtube, twitter, facebook etc.
have raised the false expectation that you can actually make things go away by
appealing to The Authority and it will be forever gone. In reality things
continue to exist; they just move into some more remote corners of the net.
Once expectations become more aligned with reality again, and people know they
can only avoid looking at content but not make it non-existent, things boil
down to being able to filter things out at the local level.

~~~
rmc
> _And even if you are on a network where you randomly host content you never
> looked at encryption can provide you some peace of mind ... If neither you
> nor others can know what flows through or is stored on your node it would be
> difficult to mount pressure against anyone to disconnect._

I think you misunderstand the objection. Yes, encryption can mean you cannot
be prosecuted for "hosting"/"transmitting" some objectionable stuff, since you
can prove that you had no idea (at least that's the theory).

However, some want to be able to "vote with their wallets" (well, "vote with
their bandwidth"). They don't want to assist in the transmission of some
content; they want that content to be hard to find, and slow and unreliable.
They have the right to freedom of association and don't want to associate with
those groups. Encryption cannot guarantee that I won't help transmit
$CONTENT.

------
jacobr
Also see this Twitter thread:
[https://twitter.com/EliotHiggins/status/896358097320636416](https://twitter.com/EliotHiggins/status/896358097320636416)

> Ironically, by deleting years old opposition channels YouTube is doing more
> damage to Syrian history than ISIS could ever hope to achieve

> Also gone are the dozens of playlists of videos from Syria I created,
> including dozens of chemical attacks in playlists by date

> Keep in mind in many cases these are the only copies of the videos, and in
> some the channel owner will have died, so nothing can stop it

~~~
giancarlostoro
That is a little insane... Makes me wish there were some sort of website that
archived specific YouTube videos marked as historical or criminal evidence or
meeting some similar qualification (as long as they don't abuse copyright), so
that if places like YouTube delete them they can remain. Or a service that
uploads to multiple video streaming sources at once (though I imagine these
might violate the TOS of YouTube for w/e reason).

Kind of sad when you have video evidence being deleted by YouTube. It would be
nice if they allowed some sort of option for political videos like these,
especially if the original uploader was killed, to be downloaded with metadata
(upload date to youtube, youtuber username, etc) so anyone can reupload them
elsewhere.
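The "download with metadata" idea could be as simple as writing a provenance sidecar next to each archived file (youtube-dl's `--write-info-json` flag does something along these lines). A minimal sketch, with a made-up schema; the field names and the `write_sidecar` helper are invented for illustration, and any re-hosting effort would need to agree on its own format:

```python
import json
from pathlib import Path

def write_sidecar(video_path: str, uploader: str, upload_date: str,
                  source_url: str) -> Path:
    """Write a provenance sidecar next to an archived video file.

    The field names are hypothetical; the point is that provenance
    travels with the file so anyone can reupload it elsewhere.
    """
    meta = {
        "source_url": source_url,    # original YouTube URL
        "uploader": uploader,        # original channel/username
        "upload_date": upload_date,  # date it appeared on YouTube
        "archived_from": "youtube",
    }
    sidecar = Path(video_path).with_suffix(".info.json")
    sidecar.write_text(json.dumps(meta, indent=2))
    return sidecar
```

For `clip.mp4` this writes `clip.info.json` alongside it, mirroring the naming convention youtube-dl uses for its own info files.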

Another case where I wish TPB had made their own YouTube clone already. I'm
sure they would not have taken down these sorts of videos.

Wondering where Wikileaks is in these sorts of cases? Do they download these
sorts of videos? Which raises the question: why don't they? It seems right up
their alley. I don't always agree with them, nor do I digest their content,
but at the very least for a site like theirs it would make sense to archive
YouTube and other politically sensitive videos, no?

~~~
aiyodev
Not to pick on you, because this seems to be a popular opinion, but I would
just like to point out the insanity of what you just wrote.

"as long as they don't abuse copyright"

Would we delete videos of the liberation of concentration camps if there was
Nickelback music playing in the background? This just demonstrates how
successful media companies have been in distorting the true purpose of
copyright laws: to promote science, art, and culture for the public's benefit.
It does not exist for fairness or personal gain. Copyright laws should be
changed to better reflect this. Nobody should be able to silence any
information that has a public benefit.

~~~
Spivak
Like, Nickelback is in the original footage, or someone overlaid the track on
top of the footage? I would imagine if this ever really happened it would be
fine in the former case, and muted in the latter.

You're forgetting that 'promote...' means giving the creator control of that
content for the purpose of limiting access and making money. Promote in the
sense that it becomes possible to actually sell artistic works like
commodities. And _then_, once the value has been extracted, the public can do
what they please with it.

~~~
tripzilch
> Like, Nickelback is in the original footage, or someone overlaid the track
> on top of the footage? I would imagine if this ever really happened it would
> be fine in the former case, and muted in the latter.

While I agree that would be the most sensible course of action, if this
happened in reality _right now_, the whole video would be deleted in both
cases, automatically (assuming it's clear enough to trigger detection in the
first example).

~~~
softawre
Nope, today they mute the audio.

------
AdmiralAsshat
It was folly to think that YouTube would be a safe place to document war
crimes. YouTube is a distribution channel, not a preservation channel. Its
ease of use certainly makes it an attractive option for uploading things
quickly, but anything of historical significance should have the raw video
immediately turned over to a human rights organization for preservation.

~~~
ballenf
Youtube used to be a distribution channel but it slowly became an ad delivery
tool with content along for the ride. Like 99% of other free (and even some
paid) internet sites.

Either way, totally agree that it's a tragic situation.

The only sympathy I have for Google is that trying to separate the good vs.
"evil" (as in "Don't be evil.") content is a monumental task that machine
learning will probably never be capable of performing. So they're left with
the choice of spending an inordinate amount on human review and detailed
research or just making wildly over-broad removals.

I'd rather they leave up more rather than less, but they tried this approach
and it almost lost them every major advertiser. So continuing down that road
would potentially lead to the whole platform losing viability. Maybe some
would like that outcome but these historical videos would be just as lost.

Maybe we'll see the pendulum swing back in an effort to reach a more
reasonable middle ground.

~~~
nxc18
Google has an inordinate amount of money to spend on an inordinate amount of
content review. Deleting history and evidence of war crimes is Evil; good
thing Google has abandoned their earlier aspirations.

~~~
jptman
I think you are underestimating the amount of video that gets uploaded to
Youtube. Smaller sites may be better at stuff like this, but that's solely
because they don't have nearly the same amount of content. Hundreds of hours
of video are uploaded per minute. They may eventually have AI good enough to
do a better job, but this is an unprecedented amount of content to review.

~~~
CaptSpify
So?

If youtube can't handle the load, then they shouldn't claim that they can. At
the absolute least, they need a usable appeals process. If they can't do that,
then they need to own up to it and stop allowing anyone to upload anything.

~~~
koide
Sorry, but why? Google owes nobody anything. If you have war crime evidence or
other important content to publish, upload it to YouTube and all other video
sites, and bring archive.org and other human rights organizations into the
loop. You can put the press in the mix as well.

~~~
CaptSpify
Because it's a shitty thing to do, and we shouldn't encourage shitty behavior?
Just because they want to take the cheap and lazy route doesn't mean that we
can't criticize them for it.

------
cisanti
I have (had) a channel that had videos about missing people, their last
sightings on CCTV, etc. The parents of a missing person even used an embed of
one of my CCTV footage videos on their site. They emailed me asking if I still
have the video because they need it.

YouTube banned the whole channel for extremist/hateful content. Probably some
of the videos/titles told the AI that the footage is extreme or some sort of
glorification.

I appealed on some form but don't even bother anymore.

I hope YouTube as a video platform (not streaming) gets a serious competitor.

------
Iv
During the Arab Spring I suspected many police violence videos would be
deleted from Youtube. I downloaded them to my server and posted the links
everywhere for people to mirror them. Not a single person has done so yet.

I have been amazed at how little importance people place on this kind of
video. You have video evidence of crimes with faces appearing clearly. It can
take 5 to 10 years for such events to calm down enough to reach a point where
crimes can be prosecuted.

And it is hard to blame youtube for that. They are considered the channel for
Lady Gaga and silly cat videos. Hell, I know 3 year old toddlers who browse
youtube unsupervised.

In many places Youtube is criticized for promoting violence and extremism by
leaving these videos up. I feel bad for them; they are between a rock and a
hard place.

I just hope that the censored videos are not totally deleted from their
servers. They should have someone reviewing criminal videos and keeping them
at the disposal of judicial authorities, but even that opens a whole can of
worms: do you obey only US authorities (who do not care about war crimes in
other countries)? Do you obey all world authorities, including the Saudi and
Chinese ones?

Anyway, that's youtube's problem, not ours. Simply put, helping prosecute war
crimes is not part of Youtube's mission, so do not trust them for it. To
anyone who feels this is important content: use youtube-dl and keep backups.
Make torrents of it, share it around, make sure it does not disappear.

And when some NGO finally realizes that this content is precious, pump up your
upload bandwidth and fill their servers.

~~~
flamedoge
Youtube isn't the only channel to upload videos. Check out webtorrent or
bitchute for p2p based videos.

------
mnm1
Such AI, coupled with the inflexible policies of companies like Google and
Amazon, is already starting to be a problem and will only get worse as it's
deployed more broadly. Accounts are closed without recourse for invalid
reasons and their owners treated like violators. Short of a law requiring
explanations and an appeal process, I don't see this situation ever getting
better. Yet another reason not to trust these companies or use their services
that require creating accounts and agreeing to their bullshit TOS.

~~~
tenpies
> Such AI coupled with the inflexible policies of companies like Google and
> Amazon

I would argue that there is a key difference in customer support which makes
me much more confident in Amazon than Google.

Google has non-existent customer support for the public and virtually non-
existent customer support for paid customers. If something goes wrong with
your Google product your best bet - even as a paid customer - is to contact
someone you know at Google. Going through the official channels is a waste of
time.

Amazon, on the other hand, will bend over backwards to make sure you're
satisfied - even if it loses money in that transaction. Refund decisions are
mostly automated at this point, although human support for both vendors and
buyers is there if needed.

~~~
mnm1
I specifically listed Amazon because they close accounts with no appeals and
no reason given. Just because you've had a good experience doesn't mean that's
universally so.

------
nnq
Maybe people should get their shit together and realize that _true free
speech_ includes _allowing_ videos that seek to recruit people into despicable
organizations to _be available!_ Yeah, even Hitler had _a right to say what he
thought_ and it's a _good thing_ he had it, _despite the consequences that
ensued._

The _problem that needs to be solved_ is how to _educate people into not being
lured into those organizations DESPITE having access to those materials..._
This kind of censorship is just as STUPID as banning drugs like heroin and
cocaine (instead of just making them unavailable to children, or without a
"license") or the "war on drugs".

Imho the problem comes from the fact that corporations try too hard to be
"democratic" about things and "please the majority". But this is not a good
idea: sometimes a majority of 99% is against freedom, and _they are wrong_,
despite being the 99%. And the majority should be opposed and _freedom
protected_ even when the cost is someone's blood. For me personally, there are
these words from my native country's national anthem: _"life in freedom or
death [for all]"_... and I will sure as hell fight, die or kill for them.

~~~
Quanttek
Agree to disagree. This isn't really an issue of a lack of _absolute_ free
speech but instead has much more to do with AI gone bad. With any reasonable
definition of free speech (i.e. limited) it would've been wrong to remove
footage documenting war crimes - hell even with a highly restricted free
speech.

Freedom of speech is not inherently the highest value there possibly is. You
should be able to defend something like "Yeah, even Hitler had a right to say
what he thought and it's a good thing he had it, despite the consequences that
ensued", because, in my opinion, I'd rather restrict the freedom of one
genocidal maniac than see the death of 85 million people.

Using your definition of freedom of speech, we could easily justify not
outlawing murder: "We should just educate people not to murder each other
instead of banning it." Maybe banning can actually have a chilling effect on
e.g. hate speech or heroin abuse? (While I'm for banning heroin, I advocate
providing services that ensure safe consumption (e.g. needle dispensaries,
consumption rooms) and help prevent (further) abuse, instead of jailing
users.)

~~~
nnq
> any reasonable definition of free speech (i.e. limited)

First, "limited free speech" is not "free speech" anymore. Second, you're just
not going to be able to "define" things anymore as you automate and replace
more and more processes with AIs (the definition will more and more start to
be "the practical implementation of the machine learning filtering algorithm
and the choice of training data"; anything else will be "approximations",
since you won't be able to prove much about these statistical algorithms). The
choice will be either (a) full freedom + massive investment in mechanisms to
manage the negative consequences of this freedom (let's start with education
first: not only making it free for all at all levels, but also giving people
free paid time to educate themselves, instead of working them to dumbness
8+hrs/day and then expecting them to differentiate real news from fake
news...) or (b) giving up freedom and living in a "well managed totalitarian
system" with "freedom for distractions and sex only" or some other deal like
that.

> we could easily justify not outlawing murder

No, there's a clear criterion: _reversibility!_ If I _say something incredibly
hate-enticing_, I can be proven wrong, and I can even retract my words and say
"I changed my mind" later; that should be ok. If I _murder someone_, that
can't be undone: even if I say I changed my mind about murdering him, he's
still dead and I've still _proven_ that I'm capable of murder (imho _all_
people are, but that's a different discussion...).

------
alexandercrohde
To me, if you want to regulate controversial opinions, you have to err
strongly on the side of too-open.

Remember, before the Declaration of Independence our founding fathers were
terrorists/rebels. I don't mean this as a snappy hollow comparison. I'm saying
that, fundamentally, you can't distinguish between a US soldier recruitment
video and an ISIS soldier recruitment video without applying a moral context.
How would an AI ever do this? And even if it could, whose moral retelling is
the right one?

Better in my mind to stay out of the censorship game altogether and promote a
forum that is inherently structured in a format that incentivizes accuracy
over emotions.

~~~
LoSboccacc
>US soldier recruitment video and an ISIS soldier recruitment video

Somehow I doubt US recruitment videos feature Englishmen being decapitated as
a job perk

------
013a
YouTube is balking at its own size. They're discovering what should have been
obvious to anyone: the sheer amount of content entering their centralized
system is impossible to moderate in any fair way. The only way they can manage
is to (A) prioritize quality moderation toward channels which are more
popular, and (B) enforce the most bland, vanilla experience possible.

They need to moderate because they are centralized, and their revenue demands
it. We, as a society, need to create a better option. Not just another
YouTube, but a seamless decentralized solution.

~~~
LoSboccacc
They need to metamoderate: let people tag videos by content (nudity, violence,
crime, death...) and as soon as a video missing a tag gets flagged, close the
channel.

Users that enable viewing of certain tags can't complain then, and google only
needs to put enough legalese around enabling content.

They're already doing that to an extent with mature content, so there's that
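The client-side half of this metamoderation idea is trivial to sketch. Everything below (the tag names, the `visible` helper, the sample catalog) is invented for illustration; the hard part in practice is getting honest crowd-sourced tags, not the filtering itself.

```python
# Tags hidden unless the user opts in; the set mirrors the examples above.
BLOCKED_BY_DEFAULT = {"nudity", "violence", "crime", "death"}

def visible(videos, enabled_tags=frozenset()):
    """Hide any video carrying a sensitive tag the user hasn't opted into."""
    blocked = BLOCKED_BY_DEFAULT - set(enabled_tags)
    return [v for v in videos if not (v["tags"] & blocked)]

catalog = [
    {"title": "cat compilation", "tags": set()},
    {"title": "war footage",     "tags": {"violence", "death"}},
]

# A default user sees only the untagged video...
assert [v["title"] for v in visible(catalog)] == ["cat compilation"]
# ...while a user who opted into these tags sees everything.
assert len(visible(catalog, {"violence", "death"})) == 2
```

The filtering happens entirely on the viewer's side; the platform only has to store and serve the tags, which is what makes the "users who enabled a tag can't complain" argument work.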

------
AmIFirstToThink
Why not create a setting that allows users to see YouTube either as sanitized
by their AI or with all content?

Allow people to choose a content level just like they choose a security level
in browser settings.

1. Legal content. May include content that violates YouTube content policy,
but is legal in the USA, or the country of the viewer. Maximum freedom of
speech and maximum ability to see content that you may find offensive.

2. YouTube content policy met. Content that is legal and meets the YouTube
Content Policy.

3. Legal, meets YouTube content policy, and meets a certain org's taste. Like
when you pick a charity to donate to when you shop on smile.amazon.com. You
can select the org whose bubble you want to live in: ADL, Focus on the Family,
Skeptics, etc. The org bans content and it is only banned for people who opt
into that blacklist on youtube.

4. When users are not logged in they get the AI-filtered list, but can select
the "all legal" or "all that meets content policy" filters even when logged
out. All other bubbles are available to logged-in users only.

Advertisers can opt into certain bubbles if they want, or opt out of certain
content, e.g. content deemed inappropriate by the AI.

How does that sound YouTube?

Don't government security agencies want to know who is watching extremist
content and who is not interested in it? How would we know who the extremists
are if they fall back to person-to-person, in-person communication?

~~~
colejohnson66
Because YouTube doesn’t care about the videos; it cares about the advertisers.
You can’t be a proponent of free speech (extremist propaganda) while trying to
please advertisers. Also, with today’s political climate, people seem like
they want anything that disagrees with them to be labeled as hate speech.

~~~
AmIFirstToThink
The advertisers would find it suitable to chase a bubble, e.g. greens,
nationalists, globalists, feminists, religions etc.

I think a good strong case can be made to advertisers that their ad will only
be served to people opting into a certain bubble. Or the reverse of that, i.e.
show my ads to all people except those who are in this bubble. Inclusion lists
and exclusion lists.

~~~
fastball
That only works if all your "bubbles" are palatable bubbles.

What advertiser is going to want to advertise in front of ISIS/Neo-Nazi
videos?

~~~
AmIFirstToThink
Why, the store that sells Nazi paraphernalia on Amazon, of course. People who
sell Nazi books.

Jokes aside, why does it matter? If advertisers choose not to be associated
with certain 'unpalatable' bubbles or channels, that's perfectly fine.

You seem to be concerned about extremist videos making money; I don't care
about that at all. I just want all videos to be available unless the legal
system demands their removal after due process. Present sanitized content by
default, and present all content if explicitly requested, where an action from
the user says they want to see offending content.

Advertisers should be allowed to choose the channels that they advertise on.
Some may choose to advertise on a default channel before the content is
flagged. Let the buyer make the decision. Why is YouTube letting certain vocal
ad buyers decide for the entire ad market?

~~~
fastball
1. Every video YouTube hosts on its platform costs YouTube money.

2. YouTube recoups the cost and makes a tidy profit from ad revenue.

3. Videos you can't put ads in front of can't be monetized.

4. Videos that can't be monetized still _cost YouTube_ to process, host and
serve.

5. As such, every non-monetizable video hurts YouTube's bottom line.

6. Why would YouTube want that?

~~~
AmIFirstToThink
YouTube decided that it will demonetize videos. They preferred to do that in
response to complaints by some advertisers, rather than giving advertisers
tools to avoid/select certain channels or certain topics. Very similar concept
to AdWords: BanWords! Put the advertiser in control of where the ad gets
shown, or specifically, where it doesn't get shown. Demonetization was an
ideological decision by Google; it was a political decision.

>As such, every non-monetizable video hurts YouTube's bottom line.

Even a non-monetized video is still eyes on the screen for Google. NetFlix
claims to compete with books & libraries for its viewers' time. Yeah, Google
may not show an ad on that video, but the next one in the
autoplay/recommended list is still going to ring the cash register. It is
actually better for Google, because they may very well show ads without paying
the demonetized content creators, whose videos actually act as leads to
youtube. These political videos are often where a viewer is sent to youtube
from non-youtube sites, which is much more valuable to google, and they get
away with not paying anything for that. A leading video is much more valuable
to google than a video in the autoplay list. Not paying for leads but getting
paid for ads on subsequent videos: a very nice win-win business model Google
developed for itself.

------
userbinator
Yes, I could see how that classifies as "extremist material", but that's no
reason to delete them...

IMHO the gradual increase of (self-)censorship in the popular Internet is
worrying --- one of the most compelling things about the Internet as it
existed was that, from the safety of your own home, you could see and
experience things that would otherwise be impossible to access. Now it seems
it's turned into a massively commercialised effort of "curating" content so
that it doesn't offend anyone, and only results in more profits for
advertisers.

~~~
cat199
Old internet is still there, you just have to not be too lazy to host your own
content.

------
Alex3917
Since my understanding is that covering up a war crime is itself a war crime
under the complicity doctrine, could Google executives be charged for this in
The Hague?

~~~
ocdtrekkie
I am not an expert in international law, but you would still have to prove
intent, I believe. It's hard to prove intent on an algorithm that very simply
is incapable of understanding the significance of its actions.

~~~
7373737373
Deferring decisions to an algorithm doesn't absolve the owner from
responsibility for its actions. If the consequences are unknown, why should
the owner be allowed to use it?

~~~
PeterisP
For things that are crimes only if the intent is there, it certainly _does_
absolve the owner from responsibility. There's no duty to preserve everything
that might be evidence in some manner (because, really, _everything_ might be)
- if evidence gets destroyed as a byproduct of normal operations, that's not
prohibited.

------
mtgx
I remember when I used to like - no, _love_ - almost anything Google did.

That seems like such a long time ago. Since then my attitude has changed to
being mostly _hostile_ towards Google, with every such event.

Google should have never entered the "content game" and should have remained a
neutral search and distribution (YouTube) platform. Once it went down the path
of being a content company, it started "compromising" in all sorts of ways
that were terrible for its users.

I wonder if the higher-ups have even noticed this change in attitude towards
them, and if they have, they've probably decided that making money is more
important, even if they become the Comcast of the internet (the most hated
company).

------
monocasa
Have they checked with YouTube to see if the files are actually deleted?

Like just because their gateway won't give you access to it doesn't
necessarily mean that the bits have been scrubbed on the back end.

Also: here's a project to archive this information.

[https://media.ccc.de/v/33c3-7909-syrian_archive](https://media.ccc.de/v/33c3-7909-syrian_archive)

~~~
mustacheemperor
Unfortunately many of the original uploaders have since died in the war, and
the deleted playlists were the only visible place the videos were accessible.

~~~
monocasa
Sure, but I'm saying that in a 'documentation of war crimes' context YouTube
might allow a little peek behind the veil if the videos aren't deleted but
hidden.

~~~
mustacheemperor
Agreed. I think the larger problem highlighted here is that in this case
"YouTube" is just the faceless algorithm making these decisions, and accessing
the judgement of a real human being is nearly impossible for the average end
user.

~~~
PeterisP
They quite likely have some humans verify the content; however, the criteria
are most likely quite straightforward (depictions of deaths violate their
guidelines).

They don't provide the judgement because that only invites attempts at
explanation and negotiation. They don't want to spend time on a careful review
of every contested video; they want to make a usually-accurate final judgement
with the minimum time investment possible (e.g. 5 seconds per video), and they
don't want to spend resources reading all kinds of reasoning and appeals, so
they don't. And it is their right to do so: they can choose completely
arbitrarily which videos to host on their site and which not.

~~~
mustacheemperor
Sure, but returning to the original point: by that token, why should YouTube
have any obligation, or even the legal ability, to retain videos like this on
"the back end?"

It is indeed YouTube's right to vaporize any bits at any time. But when they
are the leading video platform on the entire web by a huge margin they need to
at least adequately present the reality of their content guidelines to users
in countries like Syria who are probably not focused on researching
alternative video hosts while trying to document chemical weapons attacks.

------
mschuster91
Once again, the only hope for customer service seems to be a (social) media
shitstorm.

Seriously, Google, Twitter and FB massively need to ramp up their customer
service and stop externalizing the costs of a lack of support onto society.
And there are many "costs": people being actively harassed and intimidated,
sometimes to the point that they are afraid to leave their house, due to hate
speech or doxxing; a loss of historically relevant information, as in this
case; people locked out of vital emails or their businesses (e.g. when their
Gmail account gets closed due to copyright violations on YouTube)...

~~~
TuringTest
> Google, Twitter and FB massively need to ramp up their customer service

This is not going to happen; the whole point of their business of offering
free massive online services is that it is dirt cheap, because it is run
mostly automatically.

No, the only way to fix the problem in those juggernauts, and protect the tiny
individuals from getting caught and squashed in their wheels, is the mechanism
that governments use to protect citizens from the worst effects of
bureaucracy: having an ombudsman. A semi-independent service to receive
complaints of severe abuse by the main service, and for which the primary goal
is protecting users, not reducing costs.

In some sense, this is how their PR department operates: they'll bring human
attention and put in all the required effort to fix an unjust situation, in
order to clean up the company's image. The difference is that today the unjust
situation needs to become a scandal first, as you said, whereas an ombudsman
would be required to examine all applications (whether accepting or rejecting
them) as part of their official mandate.

~~~
mschuster91
> No, the only way to fix the problem in those juggernauts, and protect the
> tiny individuals from getting caught and squashed in their wheels, is the
> mechanism that governments use to protect citizens from the worst effects of
> bureaucracy: having an ombudsman. A semi-independent service to receive
> complaints of severe abuse by the main service, and for which the primary
> goal is protecting users, not reducing costs.

Yeah, but who would finance the ombudsman? To service a country like Germany,
I'd bet it needs around 2,000 FTE minimum (Facebook alone is opening a new,
additional 500-FTE centre right now, and that's just for deleting the worst of
the worst hate speech and porn). That's around €5M _per month_.

Having it paid for by taxpayers is the true manifestation of cost
externalization, having it paid for by the services quickly leads to "do
whatever $company wants", and having it paid for by users leads to service
only for those who can afford it, leaving the poorest and most vulnerable out
in the rain.

~~~
TuringTest
Maybe a non-profit foundation could assume the task, financed by Google's PR
money and wealthy patrons, and having some sort of community representation to
take care of the needs of users.

Modern society is looking a lot like time-compressed feudal eras, with
corporations taking the role of noble families; imho the time of powerful
independent bourgeois professionals, thriving under the rule of law in nation
states, is coming to an end. Maybe we should start looking at the medieval
ways of organizing a fair society, at least as the starting points for the new
social structures that will be unique to the digital era.

------
tempodox
Mis-applying bad so-called “artificial intelligence” is still a prime example
of natural stupidity.

------
brndnmtthws
If you use YouTube, you are subject to the whims of that private corporation,
regardless of whether it's right or wrong.

They should find a way to host the content somewhere else.

~~~
emilsedgh
Well almost _everything_ we use these days belongs to a private organization.
And we're fully subject to their whims.

 _Something_ is wrong about that.

~~~
crusso
And yet, without those private organizations and the free markets that allowed
them to thrive, the whole notion of having all of this preserved online by
them would be utterly unimaginable. The internet as we know it, capable of
delivering all this bandwidth to individuals around the world, wouldn't exist.
At best, it would still be a toy in government labs and academia, like it was
before it was commercialized.

~~~
emilsedgh
I'm not against privatizing the internet. I'm sure it wouldn't have been this
useful if it weren't for private companies.

I'm just saying the notion that private companies (on the internet or not)
have almost zero responsibility and we're subject to their whim is wrong.

I guess that's where the rule of law is supposed to enter the discussion.

------
raverbashing
Why are people storing evidence on Youtube again?

Not blaming the victim, but at this point most Google services have not shown
themselves to be reliable, especially if you require some kind of thinking
human behind a decision.

~~~
gambiting
If you are a rebel in a contested part of the world, submitting videos to
youtube takes literally 2 clicks on any cheapest android phone, and then the
entire world can watch it. Everything else requires at least a bit of
technical expertise which you might not have.

~~~
raverbashing
Correct.

You could also save it to Google Drive or other "Cloud Backup" solutions like
OneDrive/Dropbox

But I guess hindsight is 20/20, and I would probably have trusted YT more than
I should have.

~~~
gambiting
If you submit a video to google drive and start giving people links then it
will be very promptly disabled with a warning that if you want to share it you
need to upload it to youtube. Same with OneDrive/Dropbox. It's fine to share
with a few people, but go into hundreds and it gets quickly shut down.

~~~
raverbashing
Yes, don't share it from those, but you can upload to both

------
Anagmate
I feel like YouTube uses its monopoly to create a walled garden focused on (in
their own words) advertiser-friendly content.

The thing is, it makes perfect sense from their side - they will make people
angry, but why would they bother if those people can't go anywhere else?

I'm starting to feel that a competitor providing the same quality of service
while allowing all kinds of videos has a chance to succeed. It's OK to host
children's videos, porn, and Syrian documentation alike, as long as you can
filter. Maybe have some sort of "curiosity" slider, with child-friendly
content at one end, YouTube-like content in the middle, and all content at the
other end, plus some category toggles. If you're unhappy with the current
selection, just take a few minutes of your time and change your preferences.

------
anotherbrownguy
Given that all of the videos happen to be anti-ISIS... and YouTube happens to
be owned by an evil empire in bed with the American military industry, which
created ISIS... the AI must have figured out that the videos could be a threat
to its masters.

------
AmIFirstToThink
What did they train the AI on to deem something 'extremist'?

Should we get to see the training data used and labels?

Or is this the modern-day equivalent of the credit-score algorithm: something
that can have a huge impact on lives, but that you are not allowed to know?

This is bad.

~~~
nthcolumn
We will have to get used to this - people hiding behind their AI's skirts
saying 'Wasn't me - she did it'.

~~~
AmIFirstToThink
They are still liable. They unleashed it upon the world.

Can't wait till they are sued out of their bubble.

------
wyager
YouTube is a really horrible service for content creators. For this type of
content, you're probably best off with LiveLeak (which,
incidentally, seems to be a much better source of breaking news than YouTube
these days). Ideally, we'd all switch to LBRY or some sort of IPFS video
distribution or something, but that will take time.

~~~
tpallarino
Yeah wow, an audience of billions having instant access to your content at the
click of a button. So terrible for content creators. Most of these content
creators wouldn't even exist without YouTube; they'd be working in a cubicle
somewhere forwarding memos.

------
dickbasedregex
Screw YouTube's automation across the board. It's horrendous and lazy.

~~~
mtgx
Yeah, forget about beating Go and StarCraft 2 top players. How about making
the takedown of YouTube videos actually fair for a change?

~~~
ocdtrekkie
Games are still 'easy' in comparison to this sort of topic for AI. Bear in
mind, every game they've had an AI play still has a firm set of rules.

------
cyanexttuesday
Google increasingly seems like a regular, almost evil corporation.

I miss the days of "don't be evil".

------
carvalho
War crime evidence can also be extremist material. It is often repackaged as
propaganda to rile up new troops.

Give evidence to the courts or police. Don't upload it to a video
entertainment site and expect it to stay up, despite skirting their rules.

------
greyman
As I understand it, this is the result of Google itself having quite strong
political opinions, at least recently. They have profiled themselves as
leftist/progressive... their software just enforces this.

------
bedros
Very related to this article about Facebook [0].

Corporations control what information is passed to people, and create their
own version of reality by blocking what they don't agree with.

I know it's AI, but it seems that Google's appeal process agrees with the AI's
decision.

People should read Noam Chomsky's Manufacturing Consent; here's an interview
about it from 1992 [1].

[0]
[https://news.ycombinator.com/item?id=14998081](https://news.ycombinator.com/item?id=14998081)

[1]
[https://www.youtube.com/watch?v=AnrBQEAM3rE](https://www.youtube.com/watch?v=AnrBQEAM3rE)

------
snakeanus
It seems that we really need to find a new distributed/decentralised
censorship-resistant way to distribute videos.
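
The core mechanism such systems (IPFS, torrents) rely on is content
addressing. A toy sketch of the idea, not any particular project's API:

```python
# Toy illustration of content addressing, the idea behind IPFS-style
# distribution: a file's address *is* the hash of its bytes, so any mirror
# can serve it and any client can verify the copy wasn't tampered with.
import hashlib

def content_address(data: bytes) -> str:
    """Return the SHA-256 hex digest used as the content's address."""
    return hashlib.sha256(data).hexdigest()

video = b"...raw video bytes..."
addr = content_address(video)

# Fetching by hash is self-verifying: a censored central host can be
# replaced by any peer that still holds the same bytes.
assert content_address(video) == addr
print(addr[:16])
```

Because the address is derived from the content itself, no single host
controls whether the video stays reachable, only whether *they* serve it.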

------
ajarmst
YouTube does not seem to me to be an appropriate medium for "war crimes
evidence". Evidence needs documented provenance, chain-of-custody, storage
integrity, affidavits, etc etc. Why does this evidence need a high-bandwidth
publicly accessible and searchable interface? For what purpose?

To be honest, if you have evidence of a war crime, I hope your plan to seek
justice doesn't depend on Youtube.

------
ajb
Douglas Adams's "peril-sensitive sunglasses" are nearly here.

------
floatingatoll
In case it's not already apparent, there's a business opportunity here for
someone to automate "set up an S3 bucket and host videos in it" as an app that
uses an API key, so that you simply provide the key to the app and it manages
your video collection, gives you a UX to it, and charges you a fee per month.
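
A minimal sketch of how such an app might wrap S3 (function names and the key
layout are hypothetical; assumes boto3 and user-supplied AWS credentials):

```python
# Hypothetical sketch of the "bring your own bucket" app: the user pastes in
# an AWS API key pair and a bucket name; the app derives stable object keys
# and hands the actual transfer to boto3 (assumed installed and configured).

def object_key(collection: str, filename: str) -> str:
    """Stable key layout so the app can list a user's videos per collection."""
    return f"videos/{collection}/{filename}"

def upload_video(bucket: str, collection: str, path: str) -> str:
    """Upload one local video file and return the key it was stored under."""
    import boto3  # deferred import; needs the user's credentials configured
    s3 = boto3.client("s3")
    key = object_key(collection, path.rsplit("/", 1)[-1])
    s3.upload_file(path, bucket, key)  # standard boto3 S3 upload call
    return key

print(object_key("syria-2017", "clip.mp4"))  # videos/syria-2017/clip.mp4
```

The user pays Amazon for storage directly, so the app never holds the videos
itself; it only manages keys and the UX, which is where the monthly fee
would come in.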

------
tetromino_
Often there is no difference between war crime evidence and war crime
glorification that machine learning could discern. Exactly the same content
could be interpreted as "look at us do great things in defense of our noble
ideals!" and "look at these monsters do horrific things for no justifiable
reason!".

The difference is in the audience's mindset - which is only partially
influenced by the uploader's intentions, and partially by how other pages and
channels link to the video and present it, and partially by historical context
(the same content can acquire a different interpretation five years down the
road). Machine learning cannot be expected to emulate that.

------
dandare
I am very concerned about Google using AI to filter hoaxes from search
results. The government testing syphilis on the black population, or selling
drugs to fund terrorism? That must clearly be a hoax, right?

------
redthrowaway
One of the most interesting developments in AI will be watching how we respond
to human rationality detached from human morality. Programs that optimize for
practical outcomes are going to come up with a whole host of solutions that we
consider abhorrent, not least because the mere notion that that solution is a
practical one riles our sensibilities.

------
chinathrow
The revolution will not be televised.

------
bryanrasmussen
I find this interesting in comparison with the google api that detects toxic
comments. I suppose we'll be seeing the same sort of situation in comments
sections (less irritating though)

------
TheRealPomax
To be fair, YouTube is under no obligation to serve some greater good; it's
just a video hosting service. Expecting it to "preserve footage", any footage
at that, is a strange expectation.

~~~
cisanti
Not an obligation, but their mission statement is "To organize the world’s
information and make it universally accessible and useful."

Guess they need to change to "information that we and our advertisers agree
with."

Yes, I know they are different companies under Alphabet, but it doesn't
matter. Google has become a monster: too big, too powerful.

~~~
camus2
> but their mission statement is "To organize the world’s information and make
> it universally accessible and useful."

It's just marketing. If you really want to see what the actual "mission" is,
read the ToS. Google, Facebook, Twitter and co. like to boast about their
humanitarian and humanist stance, the Apple way, when it comes to their
relationship with their users, but that's all a lie. The moment the needs of
their users don't match their financial interest, all bets are off. The shit-
storm triggered by a few outraged online publications and advertisers a few
months ago is a demonstration of that fact.

Independence and freedom of speech online have a price, and "users" are going
to find that out the hard way when Google refuses to host their content for
political reasons.

People have already forgotten that the tech to share content online already
exists. It's called RSS, and Google, Twitter, Facebook and co. want it to go
away.

------
StreamBright
A torrent-based YouTube alternative when? I think the technology is ready to
move all of this content to a distributed system where it cannot be censored.

------
mirimir
So does YouTube want to look like an ISIS supporter? Or at least, that it
doesn't approve of criticizing ISIS?

------
ekianjo
Is it possible to host such videos on archive.org? Is that a valid option?

------
devpalmari
hope YT did a soft-delete on those files...

------
AmIFirstToThink
And to think they had me convinced that this was not going to happen for a few
decades.

I think YouTube went down pretty fast and without a fight. The ideological
takeover of Facebook and Twitter raged on for a few years; I think YouTube was
taken over literally overnight. I remember being appreciative of YouTube just
a few days back.

Guess it's time to cancel my $15 YouTube Red family membership. Ugh, I really
hate ads on YouTube, and I was happy to give my $15 month over month. But I
can't fund YouTube anymore given what they are doing. $15 to YouTube, $10 to
Netflix, $10 to Amazon: with $35 a month, I can sponsor a ton of content on
Patreon that I like. My subscription list on YouTube is not 35 people long, so
I think it would work out.

Never did I think I would type these words: break up Google and Facebook and
Amazon.

~~~
pmoriarty
_" I really hate ads on YouTube."_

Just use youtube-dl and you'll never see any ads.
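
For instance (the IDs below are placeholders; assumes youtube-dl is
installed):

```shell
# Watch ad-free by fetching the file locally; this also doubles as an
# archive against takedowns.
youtube-dl "https://www.youtube.com/watch?v=VIDEO_ID"

# Mirror a whole playlist or channel, skipping anything already saved:
youtube-dl --download-archive archive.txt \
    -o "%(uploader)s/%(title)s.%(ext)s" \
    "https://www.youtube.com/playlist?list=PLAYLIST_ID"
```

The `--download-archive` file makes re-runs incremental, which matters if
you're mirroring at-risk material on a schedule.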

~~~
arca_vorago
May I also suggest mpsyt (mps-youtube), a GPL CLI YouTube interface that will
open your player of choice. I hardly use the web interface anymore.

I wish goblin would take off.

~~~
detaro
Do you mean [https://mediagoblin.org/](https://mediagoblin.org/), or something
else?

~~~
arca_vorago
Yep, I meant mediagoblin.

------
crusso
_should be required by law_

If your videos don't pass the algorithm, post them somewhere else rather than
reaching for the government hammer.

Youtube/Google has every right to run their business of posting or denying
video content the way they see fit without justifying it to you, free user of
their service.

If you think they're making a bad business decision and that there's a need
for a video service that gives great explanations when they deny your videos,
start such a service.

~~~
AmIFirstToThink
And the government has the right to break up monopolies.

I now fully support breaking up Google, Amazon, Facebook and Microsoft; maybe
Apple.

Executives at publicly traded companies have no right to enforce their
individual political views as company policy. They are not privately held
companies; they are public companies, held to higher standards of equal
treatment for all.

What if Bic and Mead said you can't write an opinion that we don't like using
our pens and notebooks?

What if US Postal Service said you can't send a snail mail if it contains
references to UPS or FedEx?

~~~
ComodoHacker
> I now fully support breaking up Google, Amazon, Facebook and Microsoft;
> maybe Apple.

OTOH, only their shitloads of money allow them to resist government demands to
hack a user's phone or hand over users' data in their cloud. Every cloud has a
silver lining.

~~~
greenyoda
Microsoft, Google, Facebook and Apple were all participants in the NSA's PRISM
surveillance program that Edward Snowden disclosed:

[https://en.wikipedia.org/wiki/PRISM_(surveillance_program)#M...](https://en.wikipedia.org/wiki/PRISM_\(surveillance_program\)#Media_disclosure_of_PRISM)

------
immanuel_huel
This was to be expected. All history books are written this way: they are
government propaganda, not documentation of the truth. So history lessons at
school are nothing but learning government propaganda.

------
davidreiss
Thank you WSJ, NYTimes and the traditional media for pressuring youtube,
facebook, reddit and social media to censor.

People aren't aware that for the past few years, traditional media and social
media have been battling behind the scenes over content, narrative and
censorship. It was a major war that the public was simply unaware of. Suffice
it to say, traditional media won.

It is amazing how a select group of news organizations and their editors and
journalists can use their bully pulpit to intimidate certain industries.

------
pottersbasilisk
Perhaps it's time for Google and YouTube to be regulated.

~~~
carapace
"Nationalize Google!" Okay, no, that would be turning the knob to 11.

I upvoted you because I was like, "Yeah, maybe some regulation might be good."
Then I thought about who would be doing the regulating and now I'm much less
enthusiastic. :-/

Still, it does seem like there should be a bit more, uh, community input into
how these vast silos are administered.

~~~
mythrwy
The community making its own silo(s) might be another option.

(Problem is, silo-making by committee has its own set of challenges.)

~~~
carapace
Well, I agree, but then how do you get Joe and Jane User to switch?

------
mozumder
Can't any prosecutor gain access to those videos via subpoena anyway?

~~~
megous
What are you talking about? They are deleting entire accounts of people who
were filming their cities/districts being bombed by Russians, Assad or
US/Coalition. (where there may not be any direct violence to any particular
persons visible)

Who will prosecute whom? Primary-source historical material is being removed
wholesale by some shitty "AI"; the record of recent history is being erased.
Researchers who want to put together some account of this history will have a
harder job. They will not subpoena Google to release material they don't even
know Google has. People whose lives were destroyed by a dictator will see
YouTube erasing evidence of what happened, oftentimes while leaving the
regime's propaganda channels untouched. It's a disgrace.

I actually came here today to try again to post about this issue, after
deleting my Google/YouTube account, because I wholeheartedly disagree with
this whole fiasco that's been going on for the last month or so. So I'm glad
it's finally being discussed.

------
solarkraft
YouTube is not a reliable video host, but that's okay; it's a company.
Fortunately, these videos don't really rely on people finding them through
algorithmic recommendations, as they are merely evidence. I don't see a
problem, and I completely understand why YouTube (especially as it's getting
as non-offensive as it can) doesn't want to show war crimes.

