
Four steps we’re taking today to fight online terror - MilnerRoute
https://blog.google/topics/google-europe/four-steps-were-taking-today-fight-online-terror/
======
DuskStar
"Third, we will be taking a tougher stance on videos that do not clearly
violate our policies — for example, videos that contain inflammatory religious
or supremacist content. In future these will appear behind an interstitial
warning and they will not be monetised, recommended or eligible for comments
or user endorsements. That means these videos will have less engagement and be
harder to find. We think this strikes the right balance between free
expression and access to information without promoting extremely offensive
viewpoints."

I cannot help but think that this will not be applied evenly - that some
political content will be allowed and some will not.

~~~
odiroot
Aha, so big US companies are our police now.

I share your fear. The "kosher" content would now be defined by some faceless
person in NY/SF.

~~~
3pt14159
They aren't installing a filter into the _browser_; they're OPENLY addressing
a problematic issue on YouTube, one of many, many video hosting websites.
Nothing stops a terrorist from getting a computer, an internet connection, and
hosting his own damn video calling for the murder of women and children.

When you call for violence against non-combatants you're breaking the law in
every single western country. If there were only one web browser and the
company behind it were implementing universal blocking measures _maybe_ I'd
agree with you, but honestly I'd have to think long and hard first.
Radicalisation is impossible to sustain in the long run, as the power the
average individual wields keeps going up.

~~~
bobabooey02
They didn't say they were going to start removing videos with illegal content
- they already do that. They said they were going to start removing videos
that don't break any rules but that the company deems unsavory. Which is
incredibly frustrating, since 1) YT has become the center of our changing
culture, and 2) not everyone lines up with the PC Californian culture that
dominates large multinational corporations.

~~~
yorwba
They didn't say they would remove the videos, instead they will display an
"interstitial warning and they will not be monetised, recommended or eligible
for comments or user endorsements." Which is not even on the level of a shadow
ban, as practiced e.g. on HN.

~~~
forgottenpass
I wonder what effect Google's wagging finger and implied scolding from an
interstitial will have on people who stumble across a video they like but
that is branded as naughty.

I find it an interesting question because:

A) Not every video branded as culturally unacceptable will actually be so.
Not every video is as bad as the worst-case hypothetical used to justify the
content classification.

The landscape of cultural attitudes differs from that of the California-based
content minders. The categorization can be flat-out wrong; there will
undoubtedly be a small percentage of videos that even the minders see as
misclassified.

B) Social interventionist policies can - and often do - backfire.

e.g.: Teens that deliberately seek out taboo. The allure of R movies, M games,
Explicit Lyrics, and underage binge drinking can cause them to live a period
of their life less well-adjusted than if that content wasn't aggressively
filtered from their lives in the first place.

------
nabla9
In the West, terrorism works through media exposure, so fighting it in the
media is a good strategy. The violence itself is ridiculously low volume per
capita.

If traditional media stepped in, it would be a major win. Instead of
maintaining a symbiotic relationship with terrorism, where they hype up the
terror, they could stick to reporting the facts and the situation rather than
providing non-stop terror-porn entertainment to the audience.

Once people realize that terrorism is just one small risk among other, much
bigger risks (like high-rise buildings with flammable material), they will
stop fearing it and terrorism will stop being effective. It will never
completely vanish, but it will not attract radicals the way it used to.

~~~
matt4077
I don't agree that traditional media outlets "hype up the terror". If you
compare the coverage of, for example, the recent attacks in London and the
fire you're referring to, they seemed to get about equal billing.

On the smaller scale, violence will get more prominent coverage, yes. But
that's just a reflection of public interest, and of how much intent makes a
subjective difference. News outlets like the BBC or the Guardian seem to be
far from "terror porn", and in the case of the BBC it's obviously wrong to
suggest they're doing it for money, seeing as they're not financed by ads.

~~~
hueving
How often do you hear about shootings in South Chicago in world news? 294
people have died there so far this year![1]

Terrorist events should be treated like any other murders and people will very
quickly stop caring. They are just hyped up by the media to drive views and
the governments love it because it allows them to expand their power.

1\. [https://www.dnainfo.com/chicago/2017-chicago-murders](https://www.dnainfo.com/chicago/2017-chicago-murders)

~~~
thinkingemote
I used to hold this very view, but now I don't watch the news. The reason I
no longer hold it is in the very name and nature of the news: it's "new".
Everyday commonplace events, even horrific ones, are not new and therefore
are not part of the news. Choose not to watch.

------
nippples
> We have also recently committed to working with industry
> colleagues—including Facebook, Microsoft, and Twitter

All of those platforms that are notoriously bad at curbing terrorism and great
at promoting the most asinine "I can't believe it's not a troll" social
justice concerns.

Also, not to mention that present-day Twitter gets a lot of money from
Wahhabi-funding Saudi royals: [https://qz.com/131532/meet-the-people-and-funds-that-stand-to-profit-from-twitters-ipo/](https://qz.com/131532/meet-the-people-and-funds-that-stand-to-profit-from-twitters-ipo/)

------
bamboozled
I truly hope these changes will include the US and Allied forces combat
footage.

I've seen videos on YouTube containing some of the most shockingly needless
violence one can imagine, complete with cheering and derogatory language from
British and US soldiers (often directed at foreign people). Disturbing.

It seemed like footage aimed at radicalising people to me.

~~~
3131s
Do you mean to say that the footage should be censored? I think their crimes
should be seen by all.

A better stop-gap measure against the radicalization of American terrorists
would be to prevent the DoD from recruiting at high schools, at sporting
events, in movies and video games, etc.

------
jscott0918
This is a terrifying step for the internet. These kinds of steps lay the
groundwork for censorship but fail to consider the impact of these changes in
the event that the bad guys come into power or gain influence over these
management systems.

~~~
remotehack
Supporting violence and hate isn't acceptable in any other format, so why
should the Internet be excluded?

~~~
initstick
While your statement is quite vague, it is no less completely inaccurate.
"Supporting violence" is subjective and ambiguous. Are action movies and
video games supportive of violence? How about war documentaries?

Furthermore, while "hate" speech is not the best use of our ability to
communicate, it does fall under the protection of free speech - at least in
the United States. In some groups, hate speech (however you describe it) is
acceptable.

Just as you are free to not associate with those groups the internet should be
free to express the ideas you disagree with.

You don't have the freedom to interfere or disrupt the freedoms of others.

~~~
remotehack
Actually just went back and re-read what I wrote, I'm not entirely sure what
the point was I was trying to get across now.

------
avaer
This scares me in ways terrorism never has.

I'm not sure terrorism has ever accomplished its goal to effect radical
change. But a powerful entity ostensibly protecting us from a group perceived
as a common evil that's not worth debating?

------
dcow
I am vehemently opposed to babysitting society, but violence is not speech,
it's action. That much seems fair. I do worry about step 3. It sounds like
that's going to silence a lot of minority opinions or screw over groups who
post not-so-liberal commentary videos, etc. Anything hiding behind an
interstitial is just crackpot conspiracy theory... Google is a negative
influence on society. Bleep. Zap.

~~~
TazeTSchnitzel
It's surely better than the current situation of literal neo-Nazis openly
talking on YouTube about “white genocide” and how the “Jewish question” must
be “solved”, without Google seemingly so much as batting an eyelid.

~~~
DuskStar
I'd rather have the neo-Nazis spewing their bullshit out in public, where
everyone can see how stupid it is. Driving something underground only makes it
more attractive in the end - after all, if it was wrong The Man wouldn't need
to suppress it, right?

~~~
flashman
The problem with allowing neo-Nazis to spew their bullshit alongside the Diet
Coke and Mentos videos is that it creates a normative appearance: "this isn't
being removed, so there is something tolerable about it, so maybe it's not so
bad."

> Driving something underground only makes it more attractive

See, I think you're wrong, and that the secret to attraction is visibility. As
an online advertising company would know.

~~~
DuskStar
But don't the neo-Nazis and ISIS already have visibility? Again, all I think
this does is add the allure of the forbidden fruit.

------
WatchDog
This is clearly in response to calls from several world leaders for more
participation from Google et al. in combating terrorism online.

Theresa May, as well as some Australian leaders, has made very general
statements that the big tech companies are not doing enough to fight
terrorism.

I expect that these statements are merely a prelude to some announcement of
legislation seeking even greater government powers and access to Google's
data.

I imagine law enforcement would rather have detailed identity information on
the people consuming the content, rather than simply having the content
removed more efficiently.

So while removing propaganda and violent videos from YouTube is just going to
drive the traffic elsewhere, at least Google's action may help resist further
government encroachment on our online privacy.

------
wowSuchTerror

      to fight online terror
    
      1. identify extremist and terrorism-related 
         videos
    
      2. increase the number of independent experts 
    
      3. a tougher stance on videos
    
      4. expand its role in counter-radicalisation 
         efforts
    

But I'm not terrified by things that happen online. Least of all videos of
events in the past.

And, to be honest, I've seen plenty of NSFW videos.

Let's not kid ourselves. Terrorism has pretty much nothing to do with videos.
Radicalization has more to do with the social realities faced on the ground,
day after day, as we live our lives. Alienation and the misery of slums, and
hopelessness of monotony without progress, are what lead to terror.

The true effects of propaganda are over-stated and over-imagined.

~~~
jjoonathan
Yes, but that's a less useful story for the people and organizations built on
gathering ever more power, so the alternative will be repeated until it
sticks. Given the number of people in this thread who take the official
narrative at face value, it looks like it's working.

------
lossolo
Democracy has its flaws, but I don't know any other system that works better
for society today. One of democracy's flaws is the balance between freedom
and safety: you can't have both. You can easily resolve this (terrorism)
problem in totalitarian regimes (look at Kadyrov and the Chechen Republic),
but the whole society will pay the price for that. On the other hand, it's
easy to criticize any censorship-like moves, but radical Islamist ideology
spreads like a cancer through Twitter, YouTube, and other social networks.
You can fight terrorists and kill them, but you can't kill the ideology. If
you can't kill an ideology, you need to control its spread. What tools do we
have to do that aside from moderating content?

Recently I was watching a live stream on YouTube about the UK terrorist
attack to get information about what was happening. I saw what a couple of
thousand people wrote in the chat below the video. I have never seen so much
hate, intolerance, racism, and fascism from so many people at the same time
in my life. I closed it because I couldn't read it; the problem is that many
young people probably didn't...

This is a very hard problem without any satisfying solution. On the one hand
we don't want to give up our freedom of speech; we are fearful that some evil
actor will use it to manipulate our society. On the other hand, ask the
parent of a child who died in the last UK attacks, or any other attack - ask
yourself what you would give up to get your dead child back.

~~~
newscracker
Frankly speaking, my observations of the actions and expectations of various
"democratic governments" around the world lead me to conclude that the world
as we know it is fast moving toward "totalitarian democratic regimes".
Instead of one totalitarian leader/pack controlling and oppressing people for
generations (as in dictatorships), we have the illusion of the freedom to
vote out and vote in different parties/leaders, who eventually toe the line
of the previously elected governments and end up making things even worse
overall. Maybe I was naive and ignorant and all this was happening all along,
over decades and centuries.

The increasing tilt in the balance of power toward the state, in the
balance-of-power equation between the state and the individual, is quite
alarming. It seems to me that the world over, we're allowing monopolistic
views and policies to grow, to the detriment of common people. Terror and
terrorism are very useful
for those in power to gain even more power. Terrorists who claim to fight for
freedom are winning by showing practically that those in power are continually
curbing our freedoms. With such policies, tech companies are just joining
forces with "democratic oppressive governments" in making them more powerful.

I do not see a clear and easy solution for these problems.

------
bhami
Let's see, this is the same Google/YouTube that marks dozens of PragerU
videos as "restricted", despite three Wall Street Journal editorials
protesting that. The same YouTube that has been demonetizing Paul Joseph
Watson, Alex Jones, and other conservatives. In summary, Google has long
proven that they have an unapologetic Leftist agenda.

~~~
patrickg_zill
Remember that in the last election, virtually no newspapers, magazines (e.g.
The Atlantic), or other media outlets endorsed Trump in their editorials.

And yet these same organizations assure us they will have clearly defined and
wholly impartial standards.

Uh huh...

------
Fluid_Mechanics
Radicalization is a very real issue as we're getting ever more entrenched in
our curated echo-chambers - I'm glad they're taking a stand.

Unfortunately I fear the damage has already been done, and these policy
changes will almost certainly push newly-popularized radicals onto platforms
Google has no meaningful influence over.

------
TheRealDunkirk
I should start an over/under pool on how soon we'll start seeing the "OMG they
banned my channel for THIS, and there's no person within 1000 miles I can find
to fix it" type of posts.

------
pfortuny
So instead of the rule of law we have reached the rule of Google.

Funny development.

------
AlphaWeaver
Interesting that they didn't address specific steps they'd take to protect
their content reviewing teams... Sounds like it would be appreciated,
especially in the context of the recent issue at Facebook.

~~~
novia
_especially in the context of the recent issue at Facebook._

I'm possibly out of the loop - what happened at Facebook? Are you talking
about the fake news fiasco?

~~~
AlphaWeaver
In short, Facebook accidentally revealed the identities of the moderators
responsible for taking down stories related to terrorism, to potential
terrorists. The HN story is here. [0]

[0]:
[https://news.ycombinator.com/item?id=14572585](https://news.ycombinator.com/item?id=14572585)

~~~
novia
Wow.. the real-name policy at work.

~~~
dredmorbius
So to speak, yes.

------
camus2
That's excellent news. I was absolutely disgusted by the amount of pro-Daesh
content on YouTube 2-3 years ago, where you could see their videos that
became famous for their high production value. YouTube absolutely contributed
to the spread of fundamentalism, no question. And there are still a bunch of
hateful nasheeds easily accessible, but YouTube is still not removing them
because "it's music"? Outrageous...

------
a_imho
I'm rather skeptical when terrorism (or Lovejoy) is invoked as a reason for
more censorship, hope it turns out well this time.

 _we are working with Jigsaw to implement the “Redirect Method” more broadly
across Europe. This promising approach harnesses the power of targeted online
advertising to reach potential Isis recruits,_

It is not really clear why Europe is specifically targeted.

~~~
klez
> It is not really clear why is Europe specifically targeted

If you mean "targeted by attacks" I'm not sure why.

If you mean "targeted with this method", it's because most of these kinds of
acts in the West happened in Europe and not somewhere else. Also, IIRC, most
of the attacks came from people born here, not immigrants, so it wouldn't
make much sense to target wannabe attackers directly in the Middle East.

------
wiz21c
From YouTube Hero's Program rules :

[https://support.google.com/youtube/answer/7159025](https://support.google.com/youtube/answer/7159025)

>>> Participation in the Program by some participants may be restricted (e.g.,
no or limited perks), including persons who are government officials,
including (i) government employees; (ii) candidates for public office; and
(iii) employees of government-owned or government-controlled companies, public
international organizations, and political parties.

that tells much...

~~~
creepydata
What does it tell?

~~~
wiz21c
It tells us that Google is completely opaque about who gets hired and who
doesn't (because the sentence is fuzzy). Google is a private company, so it
seems logical that they have the last word. However, given the size of
Google, I'd be happy if the ethical argument about what constitutes an
acceptable video came to the table too.

The fact that Google tries to do something about terrorism is absolutely not
the same as it being effective at it. For us to know that, we must be able to
review their results (and how do we even define a result in this area :-/ )
as well as their methods (which requires transparency).

------
dcow
This has become another one of those mysteriously-disappeared-from-the-
front-page-of-hacker-news posts.

------
blfr
I don't believe that there is a problem with terrorism online or that such a
thing as online terror exists. Only the most feeble can be terrorised through
the screen.

The problem is that there are terrorists among us who are willing to bomb,
stab, or run people down with a truck. That there are entire communities where
this either receives support or is at least tolerated.

It has very little to do with technology. They don't do it because they saw an
ISIS video on YouTube or a post on Facebook. I doubt this is even a minor
factor. Hodgkinson watched The Rachel Maddow Show. On cable.

~~~
Jarwain
I wouldn't say that it's about individuals being terrorized through the
screen. I'd say that it's at-risk individuals who are persuaded to radicalize
through the internet. By at-risk, I refer to individuals who may be suffering
personal issues in their life, and are susceptible to the messaging used by
terrorist recruiters.

~~~
forgottenpass
If there is one thing at-risk people respond positively to, it's ham-fisted
social intervention that DGAF about them and openly seeks to control their
behavior.

------
amelius
Will every internet company have to reinvent their own counter-terrorism
tools?

That doesn't sound very efficient to me.

Also, isn't this a task for governments?

~~~
tonyedgecombe
I'm more worried about government getting involved than some loss of
efficiency.

------
theparanoid
Hello Big Brother.

~~~
Buge
Do Hacker News moderators count as Big Brother? A website can decide how it
wants to moderate itself.

~~~
akerro
Google said they will cooperate with governments and third-party
organisations, which means they will have a voice in what the government
'thinks' is good or bad, and they will use that power to shape our political,
social, and legal regulations and standards the way Google sees fit. At some
point, AI created and provided by Google will be writing laws and regulations
in some cities or states...

------
sashok_bg
more and more regulation = less freedom

Simple equation, no way out of it

------
TausAmmer
The likes keep spinning, the memes keep churning. [Translated from Russian:
"Лайки крутятся, мемасы мутятся."]

------
hammock
They say they are fighting "violent extremism online" but how can you be
violent online? That's a physical thing. Instead, they outline four points for
greater censorship of videos.

~~~
codedokode
You cannot be violent online, but you can spread lies and propaganda, and
some people will believe them. Governments have a long history of doing this.

------
SCHiM
Personally I'm opposed to this, assuming Google's goal with these steps is to
protect open societies. Not because I think you should be able to post
content without restrictions, but because I think it serves society better to
get desensitised to this type of violent content.

In the context of 'protecting' society:

IMO it's better to protect your society by removing or blocking blatant
propaganda, but I don't think that ban should be extended to violent/gruesome
content. The less you see of it, the harder it will hit once it gets past the
blockade. I consider it a healthy human response to be able to see such
images, think "Fuck those <INSERT_GROUP_OR_STATE_HERE> assholes", and then
move on with your day. Shielding people will have the opposite effect, since
people will not be familiar with the possibility of seeing such things, and
it will therefore retain its shock value/utility for the
<INSERT_GROUP_HERE>.

No matter how much (Western?) society might fool us, humans can deal with
this. We've lived as savages, constantly tested by nature and the cruelty of
life, for most of our existence on earth. I'm not saying we should return to
such a state, but we also shouldn't needlessly coddle responsible adult
people either.

~~~
killjoywashere
It's not the blood and gore that we can't handle. It's the psychological
manipulation. They're using the same psychological tricks that Trump used,
that Colgate uses. That every advertiser uses. But now it's war. This is the
new war: the war for the mind of man itself.

~~~
remotehack
Are you equating a fair American Presidential campaign to people who cut
people's heads off and record it?

~~~
killjoywashere
No. They are as different as can be. But there are many kinds of different.
Some are better than others. And neither of those are good.

