1) The British government have built AI tech capable of identifying extremist views in any video, at the incredibly low price of £600,000, doing what no other technology company in the world has managed.
2) A smart consulting company has convinced the British government to waste £600,000 on building a model that can flag certain videos from a very specific training set, but in real-world use is horribly inaccurate and completely useless.
My guess: someone enterprising has seen her on TV and spotted a business opportunity. We should be grateful that they only took us for £600k.
I'm sure some other companies could have done it for 100x the cost using "proven technology" (which doesn't work for the problem at hand but sounds nice on paper).
To be honest, I actually think that if you just look at metadata (IPs, locations, services, usernames, previous uploads, titles, etc.) you can probably get the percentages they're talking about, at least until people start marking up the extremist material in different ways. The article even says they don't want to reveal how the technology works, which implies it'll be trivial to get around.
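For what it's worth, here's a toy sketch of what metadata-only scoring might look like. This is my own invention, not the Home Office tool; every field name, keyword, and weight is a made-up assumption:

```python
# Toy metadata-only risk scoring -- all field names, keywords, and
# weights here are invented for illustration, not from any real system.

SUSPICIOUS_TITLE_WORDS = {"caliphate", "martyrdom"}  # hypothetical keyword list

def metadata_score(upload):
    """Crude risk score computed from upload metadata alone (no video analysis)."""
    score = 0
    # Uploaders whose content was previously removed are the strongest signal.
    score += 3 * upload.get("previous_flagged_uploads", 0)
    # Brand-new accounts often mean re-uploads of removed material.
    if upload.get("account_age_days", 365) < 7:
        score += 2
    # Naive keyword match on the title.
    title_words = set(upload.get("title", "").lower().split())
    score += 2 * len(title_words & SUSPICIOUS_TITLE_WORDS)
    return score

print(metadata_score({"title": "road to the caliphate",
                      "account_age_days": 2,
                      "previous_flagged_uploads": 1}))  # prints 7
```

As the comment says, any fixed scheme like this only works until uploaders notice it and start changing titles and accounts accordingly.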
* That the false positive rate is going to result in legal material being taken down at a rate exceeding ISIS propaganda by an order of magnitude, even if what they claim is true.
* That the success rate on the data set you've been training on is highly likely to completely misrepresent your true success rate.
* That the processing power required for this is likely prohibitive: it won't be capable of running in real time, and any actual implementation will produce much worse results due to performance optimisations.
That's leaving aside the deep and obvious problems with the government forcing private companies to censor legal speech with no oversight or even a human in the loop, let alone law enforcement in the loop.
What this press release is, is a calculated attack on free speech: deliberately misleading the public about the capabilities of the technology in order to attack the technology companies they claim to want to work with, and to apply public pressure on private companies to do police enforcement jobs.
The only response to this is to state the obvious: if the government wants something censored, it can apply to a court for an injunction, as due process requires. In the meantime, let's get rid of this abhorrent stream of Home Secretaries.
* No one quotes metrics based on the training set, so I'm sure these guys haven't done that.
* Most trained ML models don't require much computing at all to make predictions.
* The idea seems to be to flag content for review by a human, so there is a human in the loop
* No one is forcing anyone to use this... so there's no straightforward censorship issue
* Maybe this does embarrass the bigger companies into explaining why they don't develop their own system to do this (removing IS content seems uncontroversial) - I presume this is why the Home Sec is really interested
I'm not sure what this means. It may be a 0.005% false positive rate, but if they're scanning, say, YouTube videos, the resulting number of false positives would be huge.
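To put numbers on that, here's a back-of-the-envelope base-rate calculation. The upload volume, prevalence, and detection rate below are entirely made-up assumptions, chosen only to show how a tiny false positive rate still swamps the true hits:

```python
# Base-rate arithmetic -- the volume, prevalence, and detection rate are
# illustrative assumptions, not figures from the article.
uploads_per_day = 5_000_000      # hypothetical platform upload volume
prevalence = 1 / 100_000         # assume 1 in 100,000 uploads is actually extremist
false_positive_rate = 0.00005    # the quoted 0.005%
detection_rate = 0.94            # a generously assumed true positive rate

false_positives = uploads_per_day * (1 - prevalence) * false_positive_rate
true_positives = uploads_per_day * prevalence * detection_rate

print(round(false_positives))  # ~250 legal videos wrongly flagged per day
print(round(true_positives))   # ~47 genuinely extremist videos caught per day
```

Under these assumptions the system flags roughly five legal videos for every extremist one, which is why an accuracy figure quoted without the base rate tells you almost nothing.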
As much as I applaud efforts to stop extremism, censorship in this form is concerning. Who is the arbiter of what counts as extremism? Obviously any website urging people to join Daesh should be blocked by their standard, but what about websites promoting the PLO, or the PKK? How about websites about the Rohingya? The Burmese government certainly seems to think they are terrorists.
Answering a rhetorical question here, but the UK is an unreconstructed colonial government left over from the Empire. In some ways it hasn't really adapted to the independence of Ireland, despite that being nearly 100 years ago. Instead it runs a system designed for control of "distant savages" from London on an absurdly reduced scale, like someone trying to use Kubernetes for their "hello world".
Three of the four home nations have colonial office structures. There's a reason there's no Secretary of State for England.
Even within England, places like Stoke-on-Trent and Great Yarmouth get treated as if they were in remote inaccessible jungle because you can't get there on an Oyster card.
Brexit is where we find out which of the bits of the British constitution are load-bearing.
(The PLO are not a proscribed organisation, but Hamas and the PKK are: https://www.gov.uk/government/uploads/system/uploads/attachm... )
Disclaimer: Am Swedish so have absolutely no idea
Remember, it was under a Labour PM that the UK invaded Iraq. Civil liberties were steadily eroded through a series of anti-terrorism laws during Blair's tenure.
Of course, things might be different now that Corbyn is running the Labour party, but it's difficult to undo what has been done. Once power has been gained, the holder is usually very reluctant to let it go.
Old Labour were characterised by a weak state and strong unions; New Labour were characterised by a strong redistributive state that was safe enough for middle-class voters; it remains to be seen what Momentum Labour actually do, if anything.
Neither "side"'s stereotype actually wants free speech. That's turning into more of a centrist thing, simply by attrition via tribalism of the "two sides" (which itself is a false dichotomy).
But a series of Home Secretaries have been hard at work to erode online liberty, from both the major parties.
I give you Jack "Boot" Straw :
The courts have recently said they can't allow the police this much power, but May has already pre-empted this with some purposely half-baked "improvements" to the law, so that it looks like they're changing the law according to the ruling when in reality they're not. Then they'll have to be sued again, which will take a few more years, and then they'll pass some new half-baked improvements, and so on. It's a game they seem more than happy to play, dragging out real reform as long as possible.
May, for instance, knew the EU Court would ban that type of surveillance (the court's top advisor tends to signal how the ruling will go many months ahead), so she rushed the IPA through Parliament about a month before the EU Court's decision. It just goes to show how mischievous these guys are.
— Sid Meier's Alpha Centauri
It'd all be a laughable trainwreck... if it didn't set further precedent of censorship and potentially destroy people who fall into false positives.
One day I fear I will open my eyes to find parts of the world blurry, because of a state-mandated image-filtering device embedded in my eye.
Turning the notion of ISIS recruits into a joke (which, sadly, they actually kind of are) would potentially stem their flow. Censorship, however, will be interpreted by potential recruits as "we are afraid of the truth"; it will harden their resolve, and they'll figure out a way around it anyway.
Amber Rudd and Theresa May are simply trying to create the infrastructure for a police state, though, using ISIS as a pretext. They're authoritarians at heart and always will be.
If anything, I can see human moderators having to do more work to restore falsely flagged content than they currently have to do in removing content - but humans are lazy, so instead we’ll just see content going in the “extremism” bin, along with a mandatory report to the government, and people being taken to court for spreading extremist content, which is actually cat videos.
It’s a Kafkaesque nightmare in the making.
If the UK government are going to throw their weight behind a tool, they'd better hope that it's of sufficient quality to not be torn to shreds by some of the most technically competent people in the world.
The UK government has always had a hostile approach to technology. Now they want to give a tech company the chance to be hostile back, and if this tool doesn't work I can see a very public challenge to its legitimacy.
Sure, there's a lot of crap written about Google, Facebook, Twitter, and the like in print media, but given how reliant UK broadcasters are on social media I'm surprised a negative reaction hasn't led to Twitter banning journalists, or Google de-listing a publication for hate speech.
In the end, the problem with censorship is that someone gets to decide what counts as extremist and what doesn't.