They weren't able to spy in bulk when communication was primarily offline, and they won't when it's primarily encrypted.
Don't let them frame the brief, anomalous period when they could listen in on everyone, as 'normal'.
Because of our state-of-the-art security, we're now able to do more things online in less secure environments. A secure, distributed internet is normal. One that is insecure by design is not.
Cellphones became so cheap and widely available that even a low-level drug dealer in a low-income neighbourhood could buy them, use them briefly, and throw them away to avoid being tapped, and still make enough money. That wouldn't have been possible years prior.
It's not a 20th-century phenomenon by any means.
They were steaming mail open, photographing it, then sending it on and forwarding the photos to MI5. They were also responsible for recording phone calls, so they had a presence in major exchanges.
We have telephoto lenses, and a 50-megapixel camera is only $4k. Imaging has gotten both substantially cheaper and substantially better.
We also now have IMSI-catchers that can easily be deployed, which makes wiretapping much easier.
By all accounts, everything that could be done in the past can now be done substantially better and much cheaper, with a lower risk of an operative being caught placing the spying device (since the distance can be increased). Which also means the meaning of 'bulk' has changed.
I'm sure reading encrypted messages would make their job easier too. But I'd need to see some hard data on how effectively advancements in other areas have enabled catching terrorists, because I highly doubt they have.
Also, backdoors seem self-sabotaging. If you have operatives in another country, don't you want them to have easy access to encrypted communication lines? If only spies use SuperSecretCommunicationApp, then that's pretty easy to trace. I understand this to be the whole reason for releasing Tor to the public: it gives a lot of cover. That's besides the fact that your enemies are also going to get hold of any backdoor created.
If undercover agents can work mostly from the comfort of their home or office because they operate online, they might be able to spend time on other activities, but not to infiltrate two organizations at once.
If some speech recognition AI finds interesting bits in conversations so that the same analyst is ten times faster at examining wiretapping output, it doesn't mean catching terrorists ten times faster.
Listening to wiretaps is only a small part of the work and Amdahl's law applies; moreover more data and better data analysis tools tend to improve quality, not to reduce effort (in this case, it becomes affordable to snoop on ten times as many people).
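To put a toy number on the Amdahl's-law point, here's a sketch; the 20% fraction and the 10x local speedup are made-up figures purely for illustration:

```python
def overall_speedup(fraction, local_speedup):
    """Amdahl's law: speedup of the whole process when only
    `fraction` of it is accelerated by `local_speedup`."""
    return 1.0 / ((1.0 - fraction) + fraction / local_speedup)

# Suppose listening to wiretap output is 20% of an investigation,
# and speech-recognition AI makes that part 10x faster:
print(round(overall_speedup(0.20, 10.0), 2))  # → 1.22, nowhere near 10x
```

Even making the accelerated part infinitely fast can't push the overall speedup past 1/(1 - fraction), which is the serial bottleneck the comment is describing.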
This kinda contradicts itself. Maybe the answer isn't 2x but more like 1.3x. But that's my whole point: it makes it easier.
No one is saying that it's a linear or exponential relationship. I'm sure everyone here would expect the effectiveness to be sublinear.
The problem, though, is that the number of people being snooped on isn't 10x. Not even 1000x. More like 1e6x. With that much more snooping, if we aren't catching terrorists 10x faster (which, let's be real, would be a poor gain), I don't know how this is even remotely justifiable (maybe a good argument could be made at a 1000x catch rate, but that's leaving out moral questions).
If you're going to talk about Amdahl, let's talk about Pareto. We know that the catch rate is going to be roughly logarithmic in the effort spent. Most terrorists will be easy to catch; the last 0.001% will take a ton of resources to catch. This is a key part of Amdahl's law: you don't just throw more and more resources at the problem, because past a certain point you stop gaining and are wasting resources. If we're spending all that money and time, and sacrificing all that freedom, just to scrape the bottom of the barrel, then I'd argue this isn't a good use of resources. That's the issue at hand here. Amdahl's law is the issue. People don't care about targeted surveillance. People care about mass surveillance that doesn't meaningfully improve the catch rate. It's a waste of money and an overreach in control.
But again, if you can show me data to suggest that the catch rate is much better then I'm open to changing my mind. But I'm highly skeptical that this is the case, because it doesn't match the intuition of the above principles.
This means the government can't do it in bulk, and it would be out of the reach of an ordinary individual, but it would still be possible.
When surveillance is being rolled out, it's always just about a few single cases here and there. And then gradually, silently, the number increases.
The argument makes no sense.
- She resigned from her previous position as International Development Secretary in 2017 when it was discovered she held secret unauthorised meetings with Israeli officials and lied about it. The meetings were not sanctioned by the Foreign Office and were a breach of ministerial code.
- A supporter of Brexit, she suggested last year that the UK leverage the prospect of food shortages in Ireland in order to gain a better Brexit deal. Although she quickly back-pedalled on her comments, she was rightly criticised for her remarks.
The depressing reality is that the current Conservative Party in the UK is stuffed to the rafters with nasty politicians just like her.
Priti Patel's voting record in parliament: https://www.theyworkforyou.com/mp/24778/priti_patel/witham/v...
"Generally voted for requiring the mass retention of information about communications"
"Voted for mass surveillance of people’s communications and activities"
Most of the reasonable and moderate members of the Tory party are on the back benches, leaving or about ready to retire. The old, reasonable, one nation Tory party is dead as a dodo.
Consider that it's the first one which appears to be serious about actually doing what the government repeatedly said it would do, both before and after the vote. A government doing what it promised it would do is reasonable. It is led by a man who wants very much to reach an acceptable deal with the EU, but will leave without one if the EU makes it necessary. That's a reasonable and moderate position of the sort that millions of business leaders take every single day.
The previous cabinet had a position like this: we're saying we'll leave no matter what, but we're lying because we definitely won't ever leave without a "deal" of some sort, which basically means the party we're negotiating with can propose whatever terms they like and we'll always accept them regardless of how terrible. Thus an "agreement" which is universally regarded as awful is presented as the only possible path forward, other than ignoring the biggest vote in British history. That's not at all a reasonable way to go about negotiations or politics. Nor is it even slightly moderate - "we must accept terrible terms or else we'll be destroyed" is an unusually extreme belief, of the sort usually held by countries which just lost a war.
Patel may have broken some ministerial code, and that's bad. But the former Prime Minister and her cabinet said 108 times the country would leave the EU on the deadline with or without a deal and they were lying every single time. The cabinets before that told voters they were committed to bringing down immigration, but after leaving government Osborne admitted the cabinet never believed in their stated goal, didn't want to do it and therefore just ignored it. That sort of blatant, knowing manipulation is far, far worse and completely destructive to trust in politics. Meeting Israelis without filing the right paperwork is trivial compared to it.
Unless there is an election and the Tory party is returned with an increased majority, it is certain that the current prime minister will have as little success in the house as his predecessor. They have a majority of 3, soon to be 2, held up by the DUP. There are enough moderates left in the Tory party to lose the government a vote on every problematic exit scenario, and on a no-deal exit, just as was the case for Theresa.
> Meeting Israelis without filing the right paperwork is trivial compared to it
No, it was worthy of dismissal. She preempted that by resigning. She was not Foreign Secretary or Prime Minister. It was not her role to make policy on the hoof on a topic irrelevant to her office. She blew her chance to explain by continuing to leave out some of the meetings, resulting in a second summons to Downing St. She does not deserve to return to senior office.
In the end, the current cabinet is much more likely to try and implement the Conservatives' actual manifesto. The only reason you describe that as immoderate and extreme is that you want them to fail to do so. It wouldn't be considered so if "UK" and "EU" were replaced with random tokens and presented to people who weren't invested in British/EU politics.
Moderate and immoderate are well known and defined political positions completely unrelated to how you are attempting to redefine them.
I'm still shocked that we have someone as Home Secretary who thought capital punishment was a good idea.
Are we going to now have three verdicts?
No, we really mean it this time, actually properly guilty.
Her unapproved meetings with foreign states should be enough to warrant her resignation, but how she continues in office baffles me.
Even if it were possible, I think the bigger question is: do we want to live in a society where any and all conversations can be eavesdropped on? I get the point that they want it for investigations, but it's been proven over and over that if there is a way, it will be abused.
Would intelligence and LE also be ok with that same rules applying to them?
Really, the fact that intelligence and law enforcement agencies are lobbying for this is utterly fucked up. Their purpose is to serve us, not the other way around.
If people with actual sense were in charge, the people pushing for this would be fired and out the door so fast they'd break the sound barrier. Actively undermining the thing the nation benefits from most economically, making it weaker to attackers, all while not making adversaries any weaker? That is inexcusable incompetence.
However "we" want them to be able to "stop the bad guys" and "monitor bad communication". "We" also have nothing to hide.
This yougov poll shows more Americans support backdoors in encryption than oppose it
The last question does show more "favor" than "oppose" for installing back doors in encrypted systems, but the first question shows much more "oppose" than "favor" for reducing encryption to help government agencies. The second question shows that more people want tech companies to protect customer privacy than to cooperate with government agencies to fight terrorism and crime.
So people want encryption back doors that don't reduce encryption and don't require tech companies to cooperate with government agencies. Of course "we can have all the good stuff and none of the bad stuff" is a common delusion among government agencies proposing encryption back doors too.
Cf. also the recent Australian prime minister who claimed that legislation can override mathematics.
I believe the phrase is PI IS EXACTLY 3
And that's where democracy falls down
This is appropriate under the assumption of accountability; right now, three-letter agencies aren't subject to it.
> "We" also have nothing to hide.
This is disingenuous or naive (and it's a worryingly widespread idea). Literally (as in literal-literal) anybody can be accused and charged; it's just a matter of legal power¹. Giving up privacy makes it dangerously easier.
¹=There's even a book on this subject (although the angle is not precisely this): https://www.amazon.com/Three-Felonies-Day-Target-Innocent/dp...
In reality these are massive organisations of people who want to do good and protect people from actual dangers and repeats of actual harmful incidents. So I think framing the motives as malevolent isn’t helpful because the motives aren’t malevolent.
I think it’s much more reasonable to ask why these things arise. Eg maybe the government says “how will you stop something like x happening again” and they say “well it would have been really hard to detect but we were slightly suspicious of them. If only we could get a warrant to find out what they were talking about...”. And this probably seems reasonable to the minister who still thinks these intelligence agencies are steaming open letters or tapping into phone lines.
It doesn’t even need to be the case that people know these laws would work/be useful, all they need is to feel that they would. And this can quite easily happen without any malicious intentions.
Other things one could imagine happening are finding warrants annoying because they feel like a formality and feeling that the pause in the process potentially causes harm. Or seeing the whole “I ask my ally to spy on my citizens” process as a silly way to get round an annoying loophole. I can imagine something like this happening in a multinational company and if you see intelligence allies as actually working together in a team it doesn’t seem so crazy to see it as a silly legal formality to allow the actual teamwork. So (to say the same thing again) I don’t think these things arise from bad intentions.
A final thing is that many people in these intelligence organisations seem to care about how this surveillance is done in an ethical way (although some people don’t). Eg note here that they want to get this ability with a warrant (perhaps they really want it warrantless and plan to get it or perhaps they feel like they were burned by the various revelations and don’t think they could get it anyway).
Compare this to the way much of the modern mass surveillance we are exposed to every day is planned where there is virtually no ethical oversight at all.
You'd have to be super naive or ignorant of history to think any differently. Even if you trust our current regime, you never know what could happen in the future. It's just too much potential harm in exchange for the convenience of not having to do targeted spying instead of mass surveillance.
Nobody likes terrorists but mass surveillance is just way too open to abuse.
I'm going to defend what is probably the minority opinion on this site and say yes, I'd rather live in a society where communication is open to surveillance.
The reason being that the situation appears to me to be very binary (and I think most people would agree on this): either there's strong encryption, in which case almost all communication is not subject to surveillance, or the state has the capacity to eavesdrop.
The first scenario scares me because it essentially eliminates the ability to engage in surveillance when it is needed, be it against wide-scale financial fraud, terrorism, crime, radicalisation or whatever else, and society has a vested interest in having the capacity to prevent these.
I don't think the two most common criticisms hold up. The first is that surveillance affects many people adversely. I don't think that's true: nobody has an interest in eavesdropping on average citizens; it's simply a waste of resources. The second is the slippery-slope argument you brought up. I don't think there is a lot of evidence that, in states of law, surveillance has been abused or employed illegally.
What about in the current case of the Nicaraguan government?
I think it's valid to say that governments like these do abuse surveillance, but my problem with this as an argument in these discussions is that it also equally applies to anything else. They abuse the power of police, of the military, of state owned enterprises and anything else, but yet in other nations we still rely on all of these facilities to a large degree.
So I think there should be a distinction between problems intrinsic to surveillance, and bad actors using surveillance as a tool for abuse.
I was thinking more along the lines of targeting of political dissidents, building unlawful programs, arresting people on the basis of information not lawfully collected, and so on.
The Snowden revelations brought this topic up when it comes to US intelligence abroad, but I don't think such violations ever occurred inside the US (or inside the respective western countries, say).
Or let's go old school. Remember good ole' J. Edgar Hoover? Even before the digital age, the man made enough waves through amplified access via HUMINT to cast a pall of doubt over decades of politics.
There is absolutely no rational reasoning to endorse further centralization to enable systemic abuse. No privileged system will remain free of abuse in the face of those seeking ultimate power. The only way to prevent those seeking it from finding it is to identify the power grab when it happens, and shut it down.
I absolutely do not condone the Four Horsemen of the Infocalypse, but let me be clear: a world with pervasive and perfect surveillance is a world where the machinery, if left alone to its own devices, inevitably tends toward the destruction of our humanity.
I know the religious language may not carry as much weight for some, but I think in this rare case it's the most concise way I can make my point. We are born as sinners; through forgiveness, repentance, and redemption we grow and cultivate that which, as a species, is generally regarded as beautiful and virtuous about us. Part of that, as a societal unit, comes in the form of fighting the most vile amongst us without abandoning the moral high ground.
We don't have police procedures and the rules around dispensing justice because we want above all else to punish criminals; if we were really out to do that, just point the mob of the majority at everyone they disagree with or deem a criminal and 'let God sort it out' as the most ruthlessly pragmatic would say.
On the contrary: we make it hard to police, and we constrain acceptable methods of investigation, because at some level we all understand the violence inherent to the system, and the inevitability of its occasional employment. We rein it in so that we may in some manner drive it, and live not in constant fear of those who drive it, as they too are (supposed to be) bound by its laws.
It should strike a tone of alarm in anyone when those acting as the executors of the system's mandates begin fighting to loosen the reins. Left unchecked, that bodes well for no one involved.
Intelligence agents willingly sacrifice much of their personal privacy as a condition of their security clearance, so by selection bias would be more willing to subject others to loss of privacy.
> it's been proven over and over that if there is a way it will be abused.
Is there actually any good, citable instance of government backdoors being abused? I believe it is possible, but I don't know of any instance of it happening.
They seem to unlock most of this info though by attacking the endpoint itself anyway.
Last year, my own country (Australia) passed a law which allows the government to force companies or even individuals to add backdoors to their products, and makes it a criminal offence to refuse or publicly disclose their requests. I would go to jail before I complied.
For those of you in other five eyes countries, you'll have similar laws soon too. Our intelligence agencies have clearly set themselves up against fundamental principles of human rights, and their efforts to undermine these must be fought.
I think the tech media and community overstate the impact of this law. The law makes it clear that the backdoor cannot introduce any systemic weakness or vulnerability, which explicitly includes "a new decryption capability in relation to a form of electronic protection".
What it allows is stuff that targets a specific person _and_ is incapable of affecting anybody else. The second part overrides the first part, so if it's not possible to target a specific person without weakening protection for everybody else, you're not required to do anything.
For example asking you to put code into your app that creates a copy of private keys and sends them to ASIO if the user's ID matches a hard-coded value would be legally okay per my reading of the law.
However adding ASIO's key to every single message would not be okay.
I'm not saying I'm in favour of the law (I'm not) but its actual effect isn't at all what people assume (I hear a lot of comments about "Australia banned encryption" and other such nonsense).
You can hold all the meetings you want, pound fists on the table, elegantly restate your problem, but the mathematical fundamentals are your immovable object. Your only option is to block it throughout your nation, and that just makes room for a new or updated version of the fly you swatted last week, one that gets around your flyswatter.
Sure, you can try to poison the code base, or inject some kind of malware, but this trick only works once. It's not a silver bullet.
Given that the maths is "out of the bag", any motivated criminal organisation or group that is intent on not being caught can quite easily encrypt their own communications. The only people who won't are the innocent public, who can be spied on with impunity.
It's always the innocent public who is the target of such moves. The goal is state omniscience, not crime fighting.
Competent criminal organisations wouldn't care about laws banning encryption, and would know to use the proper tools.
The random non-competent criminals caught this way would be used to justify the measure...
Of course governments cannot stop tech people from using e2e encryption. That's not what bothers them. What bothers them is that e2e encryption is the default. They want to change the default to be insecure.
The people who demand these laws are tasked with making various statistics change, such as crime rates. They are metaphorical paperclip maximisers: a surveillance state is not a goal, it is just a means to an end, a way to make their numbers look better.
The real goal is to expand the state. Bureaucracies, like cancer, want to expand into any aspect of their subjects' lives they can. They get job security, bigger budgets, and more power as they expand; what's not to like?
Individual opinions (of politicians, bureaucrats, etc) don't really matter, as the aggregate dynamics of a state/organization tend to expansion and self-preservation -- even if an organization has no real role or is mostly BS (eg TSA).
Besides, nothing wrong with being cynical. Remember McCarthy? The real goal of the collection of data then wasn't to "catch stupid criminals". They collected data on politicians, journalists, businessmen, etc. Same thing with now public records of 10 and 20 years later -- check the files on people like MLK, Lennon, civil rights leaders, etc.
(And of course, anybody with experience of European governments like those in Italy, Greece, Spain, France, and so on, knows that private info/surveillance is used all the time against whomever the state considers an "internal enemy". In fact, collection on those was far more common up to the 90s than collection on criminals, who merely had their criminal record and periods when they were bugged on suspicion of a crime, whereas politicians, businessmen, activists, writers, journalists etc. were tracked all the time.)
When they say (not in so many words) that "I want for us to be able to bulk monitor communications", I'm not willing to concede that they aren't considering or aware of some of the possible second order effects.
To me it seems more likely that they feel they (may) get to move their crime rate statistics and, when their mind goes for a cursory wander over the likely consequences, they find all of those to be mildly pleasant as well.
Then you confiscate devices and jail people when you believe (or have "reasonable suspicion") that they use one.
Problem solved for the 99% of the population!
This isn't p*rnhub. You can't backdoor everything.
Why would you censor pornhub?
Those are who they want to spy on anyway.
For serious criminals, terrorists etc they have other tools, banning commercial encryption wont help with those...
Smart bad guys get away with it.
Being able to read WhatsApp would help us catch many more stupid bad guys.
Smart bad guys will always be able to get away with it. That doesn't mean we should stop trying to catch stupid bad guys.
And in this case, being more technically smart might just mean clicking the link to the E2E encrypted web chat site rather than the server-to-client encrypted site. Perhaps, though, the government will start banning websites that offer E2E encrypted chat, and require hosting companies to not let you host such apps yourself.
This entitlement is obscene.
If backdoor access is granted then a new set of heads will emerge from the hydra.
Dealing with the challenges faced by having a mere 97% conviction rate, federal prosecutors and law enforcement conspire with foreign powers to remove pesky civil liberties.
> By choosing a simple but strong cipher that is already widely published and agreeing on how to use it, anyone with elementary programming skills can write their own encryption program without relying on any products that can be banned.
And Pontifex, aka the Solitaire Encryption Algorithm (SPOILER ALERT):
> In Neal Stephenson's novel Cryptonomicon, the character Enoch Root describes a cryptosystem code-named "Pontifex" to another character named Randy Waterhouse, and later reveals that the steps of the algorithm are intended to be carried out using a deck of playing cards.
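For reference, the cipher the quoted text has in mind (CipherSaber) is just RC4 keyed with the passphrase plus a 10-byte random IV that is sent in the clear at the front of the ciphertext. A minimal sketch of the CipherSaber-1 scheme as I understand it (RC4 is long broken, so this is for illustration only, never for real use):

```python
import os

def rc4(key: bytes, data: bytes) -> bytes:
    # RC4 key-scheduling algorithm
    S = list(range(256))
    j = 0
    for i in range(256):
        j = (j + S[i] + key[i % len(key)]) % 256
        S[i], S[j] = S[j], S[i]
    # RC4 keystream generation, XORed with the data
    out = bytearray()
    i = j = 0
    for byte in data:
        i = (i + 1) % 256
        j = (j + S[i]) % 256
        S[i], S[j] = S[j], S[i]
        out.append(byte ^ S[(S[i] + S[j]) % 256])
    return bytes(out)

def ciphersaber_encrypt(passphrase: bytes, plaintext: bytes) -> bytes:
    iv = os.urandom(10)                     # 10-byte IV, sent in the clear
    return iv + rc4(passphrase + iv, plaintext)

def ciphersaber_decrypt(passphrase: bytes, ciphertext: bytes) -> bytes:
    iv, body = ciphertext[:10], ciphertext[10:]
    return rc4(passphrase + iv, body)
```

The whole thing is short enough to memorise and retype, which was exactly the point: a cipher you can reconstruct from understanding even if distributing implementations were banned.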
Laws cannot stop encryption; they can maybe stop law-abiding people from using it, but not criminals.
Also, when you implement the ciphersaber, you're still only about 1/4 of the way to the functionality of early-1990s PGP, notably lacking any public key functionality.
> For file encryption, a user need only memorize one key or passphrase. For messaging, users need to exchange pairs of keys through some secure means, most likely in person. Maintaining a list of correspondent's keys or passphrases in a master file, preferably itself encrypted with a memorized master key, is less convenient than public key encryption. But it may be all that is left in a few years if PGP key servers are banned.
> It may even be possible to teach a manual version of the Diffie-Hellman key exchange, perhaps using large number calculators (easily built in Java 1.1). The Diffie-Hellman procedure need be carried out just once per pair of correspondents, since CipherSaber eliminates the need to exchange keys for every message.
Apart from the implausibility of some of this, you have a very severe key-synchronization issue if you literally only want to do a key exchange once. For example, an attacker who can intercept one party's message and then trick another party into encrypting a known plaintext with the same key material (because that party doesn't know the keystream has advanced yet) can then decrypt the intercepted message.
Even having the two users accidentally use the same part of the keystream to send separate unknown messages m₁ and m₂ will allow an adversary to compute m₁⊕m₂, which is very bad in many cases. One thing I remember from Dan Boneh's cryptography class is that if either message contains an ASCII space character (' ') at some position, then m₁⊕m₂ will contain the other message's character at that position with uppercase and lowercase inverted (for example, ' ' ^ 'q' is 'Q').
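This two-time-pad failure is easy to demonstrate. A sketch with a made-up stand-in keystream and made-up messages (in the real attack the keystream would be RC4 output that two users accidentally reused):

```python
# Stand-in for a reused keystream; its actual values don't matter.
keystream = bytes(range(50, 62))

m1 = b"meet at noon"
m2 = b"the gold bar"   # same length as m1 for the demo

c1 = bytes(p ^ k for p, k in zip(m1, keystream))
c2 = bytes(p ^ k for p, k in zip(m2, keystream))

# The keystream cancels out: c1 XOR c2 == m1 XOR m2.
combined = bytes(a ^ b for a, b in zip(c1, c2))

# Wherever one message has a space (0x20), the XOR exposes the other
# message's letter with its case flipped, since XOR with ' ' toggles bit 5.
for i, x in enumerate(combined):
    if chr(x).isalpha():
        print(i, chr(x))   # prints: 3 T, 4 G, 7 D, 8 N
```

Those four leaked letters ('T', 'G', 'D', 'N') are exactly the case-flipped characters adjacent to the spaces in the two plaintexts, recovered without ever knowing the keystream.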
The ciphersaber idea is conceptually really great, and I love the idea of helping teach people to create their own communications and communications security infrastructure. But I think that, apart from just how archaic the cryptographic technology it teaches is, the project really underestimates how far away this cipher implementation is from a complete system.
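That said, the quoted suggestion of a manual Diffie-Hellman exchange really is mechanically simple. A toy sketch with deliberately tiny, insecure parameters (real use needs a large prime and, crucially, authentication against man-in-the-middle attacks):

```python
# Toy Diffie-Hellman key exchange with deliberately tiny parameters.
p, g = 23, 5            # public prime modulus and generator

a = 6                   # Alice's secret exponent
b = 15                  # Bob's secret exponent

A = pow(g, a, p)        # Alice sends A = g^a mod p in the clear
B = pow(g, b, p)        # Bob sends B = g^b mod p in the clear

shared_alice = pow(B, a, p)   # Alice computes B^a mod p
shared_bob = pow(A, b, p)     # Bob computes A^b mod p
assert shared_alice == shared_bob  # both arrive at g^(ab) mod p
print(shared_alice)     # → 2
```

Each step is just modular exponentiation, which is why the CipherSaber page can plausibly suggest doing it with a big-number calculator; the hard parts a real system adds are parameter sizes, authentication, and key freshness.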
It is also likely that the technology they use to solve that problem will be much less sophisticated.
Well, one of the four boxes, anyway.
It's not the tool, people, it's how you're using it.