
Facebook failed to block 20% of uploaded New Zealand shooter videos - sahin-boydas
https://techcrunch.com/2019/03/17/facebook-new-zealand/
======
zed88
I think censorship is not going to fix the issue, people have the right to see
and face evil if they so wish to and have a strong heart.

I am from NZ and a Muslim, and I watched the video myself an hour after the
incident, I know people personally affected by this and I think we all need to
come together as human beings and understand that there are some evil ideas
and people everywhere around the world.

And that's a challenge we face that won't go away through censorship and
putting our heads in the sand.

~~~
TheTruth1234
>> I think censorship is not going to fix the issue, people have the right to
see and face evil if they so wish to and have a strong heart.

It's the inadvertent exposure that's more of a problem. The access is just so
easy. If you were browsing at the time it was happening, you almost couldn't
not watch the video because it was so prevalent.

I'm imagining a 16 year old coming across it, and wondering how they process
it.

I made a conscious choice/effort not to watch it and 99% succeeded. But there
are moments I couldn't miss because they were everywhere.

By the way, that's not burying my head in the sand. It's more about respecting
the dead and not needing to see it to understand the darkness in some people.

~~~
gojomo
I'm curious, on what sites/services did you find the video so prevalent it was
hard to avoid?

~~~
TheTruth1234
If you need to ask me that ... I congratulate you on your restrained use of
social media.

~~~
gruez
Are you saying that the video was being posted to social media as "shock
videos" (ie. the viewer did not know that it was violent before watching it)?
If so, why is censorship needed, rather than a NSFW/NSFL label?

~~~
TheTruth1234
I'm saying the video was readily available as "news".

I don't care how it's dealt with, but I prefer it not to be a part of my
regular online diet.

~~~
gojomo
So, how does having it be "available as 'news'", behind an intentional click
or two for those who feel they want or even "need" to know, affect _your_
online diet?

Can't you just pass it over without clicking, like with the hundreds/thousands
of other offered-but-unwanted links one encounters in daily web/app usage?

~~~
TheTruth1234
"Intentional" ....

It was beneath the top trending hashtag on twitter. Initially, it was
absolutely not obvious what the video was that kept appearing. Sometimes it
would auto play.

Obviously it rapidly became clear what was going on ... and then it was a
basic game to escape ... and then to just switch off social media.

Honestly ... there are some f@cking weird questions asked about this subject
on HN.

Do you mind if I ask your age and general location? I'm really curious.

------
huffmsa
The video of the 767s hitting the WTC isn't purged from the internet, why
should this video be?

The world is dark and full of terrors (but getting better every day), and
sticking our heads in the sand doesn't change that.

~~~
tikwidd
The WTC footage is documentation of a terrorist attack, filmed by innocent
bystanders.
The Christchurch Mosque shooting footage was created by the terrorist with the
aim of becoming viral. It is a component of the terrorist attack itself.

~~~
tenpies
If no one had recorded anything, but a few months later Al-Qaeda released
ground-level videos of the attacks happening, would we ban them _then_?

~~~
bhl
Facebook already bans ISIS propaganda videos, so it's a small leap to assume
they'd ban Al-Qaeda-released videos as well.

------
x38iq84n
The shooting in Christchurch is a tragedy that should not have happened.

That said, this massive wave of censorship across MSM and NZ ISPs, regardless
of the motivation, is very disturbing. It will inevitably be used to suppress
citizens' freedoms, and it will cause more tension in society that will
escalate into more tragic incidents. The video may also help someone study the
modus operandi of such an attacker and improve their chances of survival in a
similar situation. Either way, censorship and attempts at rewriting history
(the removal of Spacey and Jackson), and once again banning inconvenient
books, are extremely dangerous and a slippery slope to get on.

~~~
telesilla
Can I ask you: how can I, someone who does not want to see any graphic
images, avoid them without censorship? It's not fair to put these things into
the public sphere where we might see disturbing material without wanting to.
Videos and images must be tagged, and therefore given a rating, so that I can
self-censor.

~~~
rndgermandude
Frankly, not my problem. That's your problem.

Today it is you who says we gotta censor what you find offensive, tomorrow
President Orange-Nimrod will censor all of CNN because it's OK now to censor
things that somebody/anybody finds offensive, and in a few decades time
President Jenna Bush Hager will censor all depictions of an Earth that isn't
flat because those are offensive to the good god-fearing citizens who are true
to the word of the Bible.

~~~
thousandautumns
The flipside is that you are making the argument that private entities should
not be allowed to choose what content they host. You worry about flat earthers
censoring non-flat-earth depictions, but a much more realistic threat is flat
earthers forcing their insanity to have a larger share of voice than it
deserves by strong-arming platforms and people that know better into giving
them a microphone, because were they not to do so, they would be accused of
"censorship".

No one is saying the NZ video should be purged from the internet. But it is
foolish to equate censorship with private social media companies scrubbing it
from their sites. You are welcome to search out other venues that host the
video, or host it yourself. Your rights are not being infringed upon if
Twitter and Facebook won't.

~~~
rndgermandude
Well, I didn't make this argument specifically here, and my answer would be:
it depends.

Common carriers, like Telstra (Australia's largest mobile carrier), started
blocking websites hosting the video. That's not OK for an ISP to do. Same as
it isn't OK for e.g. a utility company to cut off power to a "nazi" (whatever
that means these days) or to Bernie Sanders supporters because the utility CEO
doesn't like "socialism".

Private entities actually hosting the content would be a different matter, and
the decision should be up to the private entity. Unless such a private entity
is a de facto common carrier in itself. I'd argue that e.g. Facebook, Twitter,
Reddit, and Google are de facto common carriers running general-purpose platforms -
given their size, influence and reach - even according to their own PR. They
are analogous to a utility company, and saying "you can host the content on
your own server" is like saying "you can just run your own generator for your
electricity needs/use solar panels and battery" or like saying "you can always
create your own telephone network if you don't like what the companies are
doing; all it takes is 2 old cans and a string".

In my opinion common carrier and de facto common carriers should not be
allowed to outright censor lawful content. Videos of newsworthy terrorist
attacks, no matter if it is made by a terrorist (e.g. this one) or just an
innocent bystander (e.g. 9/11, Las Vegas), are not illegal and should not be
allowed to be censored by such de facto common carriers/de facto utility
companies.

------
warning26
I know people on here will disagree with this, but I think many of these
events indicate that online censorship _is_ needed.

I believe it's becoming increasingly clear that merely allowing far-right
dialogue online is essentially responsible for ease of radicalization, and
that proactive censorship would be the best approach to forestall this.

~~~
marcinzm
>I believe it's becoming increasingly clear that merely allowing far-right
dialogue online is essentially responsible for ease of radicalization, and
that proactive censorship would be the best approach to forestall this.

Because of course, censorship only blocks the material you personally don't
want others to view and never material you personally would want others to
view. I can see a lot of politicians, especially those getting elected
nowadays, wanting to block any discussion of minority rights, LGTBQ+ rights,
trans rights, drug reform, abortion, women's health, women's rights, climate
change, non-christian religion discussions, etc, etc, etc.

edit: Note that this isn't a straw man argument. All it takes is one to look
at the censorship that happened in the US in the 50s and 60s (comic code,
movie/television code, mccarthyism, etc. ) to see what is possible.

~~~
yongjik
Yes, but on the other hand, the prevalence of far-right dialogue contributed
to the rise of far-right politicians. When they are in power, they use their
power to block the voices for minority rights, women's rights, LGBT rights,
etc. Do you think they will stop, thinking "Oh but far-right voices are still
allowed, it won't be fair if I silence women's activists"?

All things considered, I'm not convinced that censoring extreme far-right
contents is a long-term net negative for freedom of speech. (Well, it _could_
be, but it's not a given.)

~~~
marcinzm
>"Oh but far-right voices are still allowed, it won't be fair if I silence
women's activists"?

No, but they'd need to pass laws and fight court battles to enable censorship
versus merely taking over existing systems of censorship. The latter is much
easier to do and much harder to notice or fight against. Especially when
merely fighting against it is grounds for censorship.

>Yes, but on the other hand, the prevalence of far-right dialogue contributed
to the rise of far-right politicians.

Depends on the scale you look at. In a sense, the far right is merely the
average person's view as of 50-ish years ago. Censoring to the view of the
average person merely ensures that in 50 years you'll be seen as having helped
far right ideas. Or in a different way, I don't believe that my current views
are the pinnacle of progressiveness (or morality or whatever) and I am loath
to prevent future progressives from being able to push their views forward.

~~~
yongjik
> No, but they'd need to pass laws and fight court battles to enable
> censorship versus merely taking over existing systems of censorship.

I don't think that's a significant hurdle to any politician with popular
support.

Look at Trump, for example, not because I hate him (well I do, but that's not
the point here), but because he is a good example of what a politician can do.
He rammed through the freaking Supreme Court a nominee who had been accused of
rape. And there was nothing the opposition could do.

And the crazy thing is, Trump doesn't even enjoy unanimous support. Far from
it.

There's not much you can do once a politician is in power, has a semblance of
popular support, and is willing to forego established political norms. Laws?
They will change laws, they will take their matter to courts, and they will
win.

So I'm still not convinced it's a net positive to have an uncompromising legal
roadblock against censorship, when the cost is to _increase_ the chance that
someone will rise to power and kick that roadblock away like a tin can.

------
40acres
We have yet to have a real discussion about what we expect the largest
platforms to do about content like this.

The nature of online content is that once it's created it can be copied quite
easily. If Facebook was able to block 80% of this content, that should be
commended.

AI is not at the point where we can expect 100% of this content to be removed
and if it was there would likely be some collateral damage that others would
complain about.

In its current state we need a lot more humans screening this type of
content, and who are we to subject other people to that type of mental
disturbance?

~~~
ben_w
20% not being caught is surprisingly high to me. I would have assumed the
majority of the upload attempts would be byte-for-byte identical, and that
even if there were a dozen minor variations of the same content, they would be
able to deal with >95% automatically.

~~~
hombre_fatal
80% of the videos probably did match a simple SHA-1 lookup and were trivially
removed. Now what?
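
The exact-match step described above can be sketched in a few lines. The byte
strings stand in for real video files, and `fingerprint` is a hypothetical
helper; a production system would hash normalized, decoded media rather than
raw bytes:

```python
import hashlib

def fingerprint(data: bytes) -> str:
    """Exact-match fingerprint: any single-byte change yields a new digest."""
    return hashlib.sha1(data).hexdigest()

# Blocklist seeded with the digest of the known-bad upload.
blocklist = {fingerprint(b"original upload bytes")}

# A byte-identical re-upload is caught...
print(fingerprint(b"original upload bytes") in blocklist)              # True

# ...but any re-encode, trim, or mirror yields new bytes and slips through.
print(fingerprint(b"original upload bytes, re-encoded") in blocklist)  # False
```

The gap between those 80% and the rest is exactly the "now what": everything
past exact matching needs fuzzier, costlier fingerprints.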

~~~
discreteevent
It's not an easy problem to solve if you want a system with an instant
response for the poster.

But if I am a journalist and I want to get something published it will take at
least a few hours to get it through an editor of some kind. FB could do this
even just for video. When a user pushes something up it takes a few hours to
be published. It would cost FB because some users wouldn't use it without that
instant hit.
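
That hold-before-publish idea can be sketched as a simple delay queue. The
class, the two-hour window, and the method names are illustrative assumptions,
not an actual Facebook mechanism:

```python
from collections import deque

REVIEW_DELAY = 2 * 3600  # hold uploads for two hours (illustrative)

class UploadQueue:
    def __init__(self):
        self.pending = deque()   # FIFO of (publish_at, video_id)
        self.published = []

    def upload(self, video_id, now):
        """Accept the upload immediately, but don't publish it yet."""
        self.pending.append((now + REVIEW_DELAY, video_id))

    def tick(self, now):
        """Publish anything whose hold period elapsed without being pulled."""
        while self.pending and self.pending[0][0] <= now:
            _, vid = self.pending.popleft()
            self.published.append(vid)

q = UploadQueue()
q.upload("cat.mp4", now=0)
q.tick(now=3600)        # one hour in: still held
print(q.published)      # []
q.tick(now=2 * 3600)    # hold elapsed: goes live
print(q.published)      # ['cat.mp4']
```

The hold window is where human or automated review would pull flagged uploads;
the cost is precisely the lost "instant hit" mentioned above.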

------
jayess
One can find thousands of copies of the video of President Kennedy's head
exploding from the bullet of an assassin. All over youtube. What's the
difference?

~~~
dewey
It was not created by the assassin and spread as propaganda.

~~~
50656E6973
"Know your enemy"

------
nullc
I wish so many platforms didn't shove content in your face that you don't
want... there would be far less call to aggressively censor this material if
so many people weren't using platforms which were designed to cram things down
the user's throat.

People should be able to live their lives largely free of unwanted exposure to
this stuff. We shouldn't have to outlaw people who want to investigate it in
order to achieve that.

~~~
sasasassy
It's your own Facebook feed.

~~~
nullc
I'm having a little bit of a hard time figuring out the meaning of your
comment.

------
Waterluvian
It's absolutely awful. But I'm kind of glad it's impossible to fully filter.
That's an important feature of the web. It's just going to really really suck
sometimes.

------
dperfect
For more context into Facebook's decision-making process when it comes to
censorship, you might find this episode of Radiolab insightful (it was for
me):

[https://www.wnycstudios.org/story/post-no-
evil](https://www.wnycstudios.org/story/post-no-evil)

From what I recall: Facebook's dilemmas over how/what to censor have existed
since very early in the company's history. Internally, FB developed a policy
document - separate from the public one we see - which would be applied by
both human and AI enforcers. The employment of those human enforcers could be
considered unethical in itself (at least without providing adequate therapy).
Over time, the secret policy document has grown substantially to account for a
lot of edge cases. While terrorist/graphic violence is normally censored,
there are provisions for allowing content that would otherwise not be
seen/known if not shared on FB (e.g., content that governments seek to cover
up). Still, there is considerable internal disagreement among executives at FB
when it comes to handling edge cases.

The fact that this secret document (and the processes that define and apply
it) has so much power in shaping public perception, politics, etc. with so
little transparency is perhaps the most disturbing part of this.

------
OmarIsmail
All the "free speech" advocates here, do you still stand by that when it comes
to child porn, or specific threats against individuals? If you do, and
publicly, well... at least you're consistent. If you don't then you're already
ok with _some_ kind of censorship, and the argument is just about where the
line is drawn.

I really don't like when people talk about "censorship in general" when they
really mean "This is below _my_ line for what should be censored".

If you want to make the case for no censorship at all then you need to use the
most extreme examples of what will be shared and consumed because that's what
_will_ happen.

~~~
sasasassy
Yes. You should not go to jail just for having some specific sequences of ones
and zeroes on a magnetic device in your home.

You should go to jail if you do actual crimes, you know, with real victims.
Not imaginary victims who don't even know you watched a video in the first place.

~~~
happytoexplain
While I agree with this notion generally, you're avoiding the relevant point,
which is that information is not inherently free of direct effects on the real
world. Child porn is the usual example where you can start to make a
reasonable argument that possessing it creates a real-world effect - not due
to the possession itself, but due to the action required to possess it, which
is seeking it out and acquiring it from a provider, etc etc.

~~~
sasasassy
Acquiring child porn doesn't have to be any more difficult than acquiring
videos of beheadings and other vile acts. It was only made difficult to
acquire out of puritanism; there's nothing intrinsic about it that suggests
you have to commit further crimes to find it.

And if the idea is that the public is too dumb and easily influenced, and
therefore cannot be shown child porn or the killings in New Zealand etc., then
it doesn't make sense that at the same time I am shown ads for violent movies
at the cinema.

~~~
anigbrowl
And yet CP aficionados often seem to end up producing their own because it
becomes an object of trade on certain black markets. Here's an example from
just a month ago: [http://www.foxla.com/news/local-news/porn-actress-
director-a...](http://www.foxla.com/news/local-news/porn-actress-director-
accused-of-sexually-abusing-young-girl-detectives-searching-for-more-victims)

How is it that people who become interested in and start to collect CP would
ever become involved in its production, in your model? Having gained access to
a product they desire, those people seem to follow the same pattern as
enthusiasts of any other taste, and develop an interest in creating their own
material, as opposed to being satisfied with that already available.

~~~
Udik
Just out of curiosity, are you fine with AI-generated child porn?

------
matz1
This is why we need decentralized, hard-to-censor platforms. Or at the very
least a competing platform that takes a different stand on the issue.

~~~
amanaplanacanal
Other platforms that don't moderate already exist. Unfortunately, because they
don't moderate, they are so awful most people won't use them.

------
alexcnwy
I build a lot of computer vision systems and did my MSc on classifying what’s
happening in video and I am VERY skeptical that it’s possible to prevent this
from happening (without having a person watch every video).

You can edit a video to make it appear as safe content, and it's a
cat-and-mouse game because you need humans to annotate those variations before
an ML system can detect them (an unsupervised model will likely lead to loads
of false positives).

I think the solution is better control over what we see and who we trust
enough to allow them to broadcast into our brains.

------
internet_user
It is a regrettable situation that is playing into the attacker's hands.

Now a single tiny state gets to impose censorship worldwide? How is that not
an inflammatory action?

------
aussieguy1234
I'm not sure how Facebook stores media assets.

But it would be stupid to duplicate a video for a shared post more than once.
A one-to-many relationship (one video used by multiple shared posts) is more
likely.

If that's the case, it should be easy to delete the original video; then all
of the shares lose their content.

~~~
huffmsa
But trolls are smart: they don't share, they reupload.

Which is still easy to catch if you hash the video. But doing something like
mirroring or otherwise making a small tweak to the video breaks that hash
lookup. Hence 20% making it through.
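
One way to see why a small tweak defeats a plain hash lookup is a toy
perceptual hash. The 4x4 "frame" and the simple average-hash below are
illustrative stand-ins for decoded video frames and for production
fingerprinting systems, not how Facebook actually does it:

```python
# Toy 4x4 grayscale "frame" (0-255 values); real systems hash decoded frames.
frame = [
    [200, 190,  40,  30],
    [210, 180,  50,  20],
    [ 60,  70, 220, 230],
    [ 50,  80, 240, 250],
]

def average_hash(img):
    """Perceptual hash: one bit per pixel, set if the pixel is above the mean."""
    flat = [p for row in img for p in row]
    mean = sum(flat) / len(flat)
    return [1 if p > mean else 0 for p in flat]

def hamming(a, b):
    return sum(x != y for x, y in zip(a, b))

# Mild re-encoding noise barely moves the perceptual hash...
noisy = [[min(255, p + 3) for p in row] for row in frame]
print(hamming(average_hash(frame), average_hash(noisy)))     # 0

# ...but a horizontal mirror scrambles the bit pattern entirely.
mirrored = [list(reversed(row)) for row in frame]
print(hamming(average_hash(frame), average_hash(mirrored)))  # 16
```

A robust pipeline would also index flipped orientations or use flip-invariant
features; otherwise, as the 20% figure suggests, trivial tweaks get through.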

------
MickRapp
The body cam video clip showed the spent casings disappearing into thin air.
It was a doctored video with special effects added. I saw it and reposted it
on both my Twitter and Facebook page. This video covers the discrepancies. Two
men in baseball caps were also shooting.
[https://youtu.be/dOCwYd4ogFY](https://youtu.be/dOCwYd4ogFY)

------
skybrian
I'm thankful Hacker News doesn't have pictures or video, and it's too bad more
online forums don't do the same. More places should stick to plain text. You
can link if you need to.

------
etaioinshrdlu
I've done some work in the AI content moderation space, and I have come to
suspect that while computer vision is probably advanced enough to detect all
of this, the neural nets are so big and deep that running them at such a large
scale costs over 10x what anyone is willing to pay.

It's like looking for needles in a haystack, and it is expensive.

There is a cost-quality tradeoff with these neural nets as well, but if you go
cheap, then your accuracy isn't useful either.

I don't know that such a sweet spot exists today where it is both
cost-effective and useful.

~~~
mr_toad
I wouldn’t have thought that scoring video with a pre-trained model would be
much more computationally expensive than other routine operations like
transcoding and compression.

~~~
etaioinshrdlu
But it is indeed quite a bit more computationally intensive, especially for
the highest quality models.
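
The disagreement can be checked with back-of-envelope arithmetic. Every number
below is an assumption for illustration, not a real platform figure:

```python
# Rough cost of scoring uploads with a large vision model.
flops_per_frame = 4e9          # a ResNet-50-class model on one 224x224 frame
frames_sampled_per_sec = 1     # sparse sampling; 30 fps multiplies cost by 30
video_hours_per_day = 1e6      # assumed upload volume
device_flops = 1e13            # ~10 TFLOPS effective per accelerator

seconds_of_video = video_hours_per_day * 3600
total_flops = seconds_of_video * frames_sampled_per_sec * flops_per_frame
device_seconds = total_flops / device_flops
devices_needed = device_seconds / 86400   # accelerators running 24/7

print(round(devices_needed))  # modest at these assumptions
```

At one sampled frame per second the accelerator count comes out modest, which
supports the transcoding comparison; dense per-frame sampling, bigger models,
and redundancy multiply it by orders of magnitude, which is where the cost
concern comes from.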

------
Simulacra
People around the office were watching it. Shockingly easy to find.

~~~
tibbydudeza
You have to ask what sort of people would watch this.

~~~
mrkstu
The same people who gathered around TVs on 9/11 after the first strike and
then saw the 2nd strike live? So most of the country?

~~~
dghughes
On the morning of September 11, 2001, nobody knew there would be a second
attack on the other tower. I distinctly remember everyone, including myself,
thinking about the B-25 that accidentally flew into the Empire State Building.
We were all at work, so it wasn't like we were watching TV.

You're assuming at the time that everyone knew it was a terrorist attack,
which is false.

Nobody knew that until well after the second plane hit. And even after the
third plane crashed into the ground.

The only reason anyone watched TV was because it was a major event with a
dramatic incident. Most of the people in my town at the time didn't turn on
the TV; I had to ask several times at a bar to have the TV turned on. Even
then people had no idea what was going on.

~~~
mrkstu
No assumptions. We were gathered around the TV in the conference room after
the first one hit and part of the conversation was whether it was terrorism,
since it was pretty obvious you don't just happen to ram into the side of the
tallest building in the country.

But your comment is a non sequitur regardless: even once we all knew it was a
terrorist attack, it was replayed endlessly on TV for months, including the
people throwing themselves off the building.

I haven't watched any of the footage of the New Zealand attack, but to say
that watching the footage is somehow exceptional is, historically speaking,
incorrect.

~~~
XorNot
There's a pretty big difference between watching an event that killed people,
and watching people - individuals - die.

~~~
tomp
Can you elaborate on the difference please, I’m not sure I see it myself...

------
Const-me
I wonder: if I try to upload a music video by Britney Spears, what will the
false negative rate of FB's censorship algorithms be? Will it be the same
20%?

------
onetimemanytime
>> _Facebook failed to block 20% of uploaded New Zealand shooter videos_

Fixed title: Facebook blocked 80% of uploaded New Zealand shooter videos, or
1.5 million of them.

------
jbverschoor
And how is the world supposed to be eaten by AI?

------
jimjimjim
I am almost as saddened by some of the responses here as I was after hearing
about the shooting.

Have you all become so gradually desensitized to violence that absolutely
nothing is beyond the pale anymore?

Are YOU, personally, YOU, happy with how things are in the world?

~~~
charlesism
That anyone feels the public has some "right" to watch other people die turns
my stomach. I didn't grow up on 4chan, or watching far-right propaganda on
Youtube, so I still see sadism as a character flaw, not a virtue.

~~~
jimjimjim
thank you.

~~~
charlesism
At any rate, as far as distribution _by private companies_ goes, that's my
view. Before someone jumps on me, I don't think _government_ should outlaw
documentation of atrocities, etc.

------
wankerrific
Facebook and YouTube would sure as f--- fix this problem if, you know, it was
the law that they hold liability for any and all content hosted on their
platform that they were, you know, profiting from by displaying ads next to
said content.

But that would require regulation, which is bad for job creation or
something.

------
Animats
The US has shootings this big most years now. Las Vegas festival shooting,
2017, 59 dead. Sutherland Springs church shooting, 2017, 27 dead. Orlando
nightclub shooting, 2016, 50 dead.

Yes, guns don't kill people, people kill people. But for a high body count,
you need a good assault rifle like the AR-15 family, the choice of most mass
shooters.

Streaming your mass shooting is fully supported by off the shelf products.
GoPro sells an official mount for mounting a camera on a rifle.[1] Video of
deer hunting with this.[2] Instructions from GoPro on how to set up your
camera for live streaming.[3] It's so easy now that anyone can do it.

And it's all legal in most US states, right up until you pull the trigger.
America, Land of Guns!

[1] [https://www.amazon.com/GoPro-Sportsman-Mount-
Official/dp/B00...](https://www.amazon.com/GoPro-Sportsman-Mount-
Official/dp/B00F19Q4LY) [2]
[https://www.youtube.com/watch?v=zAOTO5ydGbE](https://www.youtube.com/watch?v=zAOTO5ydGbE)
[3] [https://gopro.com/help/articles/block/getting-started-
with-l...](https://gopro.com/help/articles/block/getting-started-with-live-
streaming)

~~~
itchyjunk
I don't follow. Ban certain guns and GoPro? What happens when someone crazy
starts stabbing people with a hunting knife? Ban knives? Ban live streaming? I
don't follow how you turned a shooting in NZ into an `America, Land of Guns`
rant.

~~~
XorNot
Because it turns out that where this happened, New Zealand, has much laxer gun
laws re: semi-automatic rifles than where the shooter was from - Australia.

Quite directly, this person went somewhere where it would be easier to acquire
the guns he used in the attack.

Unlike America, New Zealand is probably going to implement Australian-style
laws after this.

