'Five Eyes' nations discuss backdoor access to WhatsApp (theguardian.com)
188 points by humantiy on July 30, 2019 | 126 comments

> Dealing with the challenge faced by increasingly effective encryption

They weren't able to spy in bulk when communication was primarily offline, and they won't when it's primarily encrypted.

Don't let them frame the brief, anomalous period when they could listen in on everyone, as 'normal'.

This. That entire period should be examined as a lapse in judgement, not a time when things were better.

Because of our state-of-the-art security, we're now able to do more things online in less secure environments. A secure, distributed internet is normal. One that is insecure by design is not.

Even HTTPS was costly in terms of resources back then. Thankfully, hardware acceleration and better algorithms came along, and there is no valid reason not to encrypt communications anymore.

They brought it up themselves too: https://www.youtube.com/watch?v=xozVBAWo8XI (not dubbed, unfortunately). Their users would accuse them of being antiquated for not doing any server-synced messages, but it's all by intent.

proper security has always been "state-of-the-art"

Even burner phones were technically state-of-the-art at one point, a story that underpinned an entire season of a TV show's plotline (and title).

Cellphones became so cheap and widely available that you could buy and throw them away efficiently to avoid being easily tapped, and still make enough money, even as a low-level drug dealer living in a low-income neighbourhood. That wouldn't have been possible years prior.

Burner phones are good for anonymity (if you buy them anonymously), but I don't think they add any security.

Really good point. I'm amazed we don't hear this argument more, and also the related one that we always used to have an anonymous payment system (cash) so why is a digital equivalent unthinkable?

Historically, intelligence services did routinely open and read physical letters sent through the mail, on bulk scales.

Currently the US scans and saves an image of the front and back of every single piece of postal mail.

Yeah, but that happened in places like the USSR, not in the free world.

It happened in premodern Europe, in places like Switzerland.

It's not a 20th-century phenomenon by any means.

In the inter-war years and through WW2 there was a GPO Special Investigations Unit in every sorting office.

They were steaming open and photographing mail, then sending the letters on and the photos to MI5. They were also responsible for recording phone calls, so they had a presence in major exchanges.


Smart speakers and IoT might make the future a lot worse in terms of privacy.

Also those Ring doorbell cameras that are being watched by over 200 police departments.

The historical norm also included pretty overbearing social surveillance of movements and contacts. I think you'd have an easier time getting a conspiracy past the NSA than getting it past the town elders.

What does this have to do with "in bulk"? Seems like you're the one misrepresenting the situation.

In bulk, as opposed to targeted spying - you can send an agent to hide behind the bushes, or plant a microphone, or infiltrate a group. Which was possible for a long time before computers or electronics (minus the microphone example), but it's not possible to do it at scale - you can spy on a few hundred people this way, but not on a few million.

In fact those techniques have gotten substantially better with technology. It is fairly easy to create a laser microphone that can pick up sound on the other side of glass. There are even devices that can do it through solid walls, though sensitivity varies vastly between these. This is a huge advantage because you don't need to get close to the target.

We have telephoto lenses, and a 50-megapixel camera is only $4k. Imaging has gotten both substantially cheaper and substantially better.

We also now have IMSI-catchers that can easily be deployed. Which makes wire tapping much easier.

By all accounts everything that could be done in the past can be done substantially better and for much cheaper while having a lower risk of an operative being caught by trying to place the spying device (since we can increase the distance). Which also means the term bulk has changed.

I'm sure reading encrypted messages would make their job easier too. But I'd need to see some hard data on how effectively advancements in other areas have enabled catching terrorists, because I highly doubt they have.

Also, backdoors seem self-sabotaging. If you have operatives in another country, don't you want them to have easy access to encrypted communication lines? If only spies use SuperSecretCommunicationApp, then that's pretty easy to trace. I understand this to be the whole reason for releasing Tor to the public: it gives a lot of cover. Besides the fact that your enemies are also going to get hold of any backdoor created.

Targeted spying is limited by manpower, not by technology.

If undercover agents can work mostly from the comfort of their home or office because they operate online, they might be able to spend time on other activities, but not to infiltrate two organizations at once.

If some speech recognition AI finds interesting bits in conversations so that the same analyst is ten times faster at examining wiretapping output, it doesn't mean catching terrorists ten times faster.

Listening to wiretaps is only a small part of the work and Amdahl's law applies; moreover more data and better data analysis tools tend to improve quality, not to reduce effort (in this case, it becomes affordable to snoop on ten times as many people).

> they might be able to spend time on other activities, but not to infiltrate two organizations at once.

This kinda contradicts itself. Maybe the answer isn't 2x but more like 1.3x or something. But that's exactly my point: it makes it easier.

No one is saying that it's a linear or exponential relationship. I'm sure everyone here would expect the effectiveness to be sublinear.

The problem though is that the number of people being snooped on isn't 10x. Not even 1kx. More like 1e6x. With that much more snooping if we aren't getting terrorists 10x faster (which let's be real, that's poor gain), I don't know how this is even remotely justifiable (maybe a good argument could be made at 1000x catch rate, but that's leaving out moral questions).

If you're going to talk about Amdahl, let's talk about Pareto. We know the catch rate will grow only logarithmically with the energy spent. Most terrorists will be easy to catch; the last 0.001% will take a ton of resources. This is the key point of Amdahl's law: you don't just throw more and more resources at the problem, because past a certain point you gain nothing and are wasting resources. If we're spending all that money and time, and sacrificing all that freedom, just to scrape the bottom of the barrel, then I'd argue this isn't a good use of resources. That's the issue at hand here. People don't care about targeted surveillance. People care about mass surveillance that doesn't meaningfully improve the catch rate. It's a waste of money and an overreach in control.

But again, if you can show me data to suggest that the catch rate is much better then I'm open to changing my mind. But I'm highly skeptical that this is the case, because it doesn't match the intuition of the above principles.

What about asking companies to voluntarily limit their encryption to bit depths that are crackable with huge resources, but to avoid anything stronger than that? Then you don't even need a backdoor.

This means the government can't do it in bulk, and it would be out of the reach of an ordinary individual, but it would still be possible.

Then in five or ten years the computing resources needed to crack that encryption would be available at the corner store.

So adjust the encryption level as time goes on. Don't make the policy the number of bits, rather set the amount of computation needed.
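A quick back-of-the-envelope calculation shows why such a threshold would need constant adjustment. This is only a sketch under stated assumptions: a hypothetical attacker rate of 10^12 keys per second and attacker compute doubling every two years (a Moore's-law-style guess, not a measured figure). The underlying point is just that each extra key bit doubles the brute-force work, so a fixed "crackable but expensive" key size erodes on a schedule.

```python
# Back-of-the-envelope sketch (not a policy tool): how a "just barely
# crackable" key size erodes as compute gets cheaper. Both the starting
# rate and the doubling period below are assumptions for illustration.

def crack_time_years(key_bits: int, keys_per_second: float) -> float:
    """Expected brute-force time: on average half the keyspace is searched."""
    seconds = (2 ** (key_bits - 1)) / keys_per_second
    return seconds / (365 * 24 * 3600)

rate_now = 1e12  # hypothetical large-scale attacker, keys/second
for years_from_now in (0, 5, 10):
    rate = rate_now * 2 ** (years_from_now / 2)  # compute doubles every 2 years
    t = crack_time_years(64, rate)
    print(f"{years_from_now:2d} years out: 64-bit key falls in ~{t:.4f} years")
```

Under these assumptions a 64-bit key that takes months of dedicated effort today takes only days a decade later, which is why the parent suggests pegging the policy to an amount of computation rather than a bit count.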

This meeting was about targeted spying and not bulk collection.

They don't need backdoors for targeted spying - they can plant hardware bugs or hidden cameras to watch the target type in messages and passwords. So the only reason they'd want them is to use them on scales where targeted spying is infeasible.

How will people know it's not done in bulk? The only thing the public sees is the client binary. A client binary with a backdoor looks the same whether the backdoor is used once, a thousand times, or a million times. The companies might publish reports on how often they were asked to access the backdoor, but they might be compelled to lie, or be hacked and not notice it, or they might even use the backdoor themselves to extract more revenue.

When surveillance is being rolled out, it's always just about a few single cases here and there. And then gradually, silently, the number increases.

The pretext of the meeting (and others like it) notwithstanding, it's not unfair to say that strong encryption returns communication to a state closer to the pre-digital era in terms of the amount of work required to perform individual surveillance. I think that was the point deogeo was making, and it seems like a sound one to me.

The only difference between targeted computerized spying and bulk computerized spying is a for() loop.
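That quip can be rendered literally. In this sketch, `intercept` is a purely hypothetical stand-in for a per-target capability (none of these names come from any real system); the only thing separating the targeted case from the bulk case is iteration:

```python
def intercept(user_id: str) -> str:
    """Hypothetical per-target capability (a stand-in, not a real API)."""
    return f"messages of {user_id}"

# Targeted spying: one call, one warrant, one suspect.
evidence = intercept("suspect_42")

# Bulk spying: the identical capability, wrapped in a for loop.
all_users = ["alice", "bob", "carol"]  # in reality: every account on the platform
dragnet = [intercept(uid) for uid in all_users]
```

Nothing technical distinguishes the two modes once the capability exists; only policy and loop bounds do.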

This is the line the Australian anti-encryption bill that passed last year was trying to walk with its "no systemic backdoors" clause. The idea/claim was that the bill only allowed targeted spying, not systemic/bulk spying, so it's okay. But that's really only a small part of it.

So why do they need bulk spying capacity for targeted spying?

The argument makes no sense.

I think the idea is that encryption doesn't prevent all eavesdropping (since the client could be hacked) but it makes it difficult to do at scale.

Some info about disgraced former International Development Secretary Priti Patel, who is the UK's new Home Secretary (a position that looks after home affairs such as crime, security, terrorism, immigration, citizenship).

- She resigned from her previous position as International Development Secretary in 2017 when it was discovered she held secret unauthorised meetings with Israeli officials and lied about it. The meetings were not sanctioned by the Foreign Office and were a breach of ministerial code.

- A supporter of Brexit, she suggested last year that the UK leverage the prospect of food shortages in Ireland in order to gain a better Brexit deal. Although she quickly back-pedalled on her comments, she was rightly criticised for her remarks.

The depressing reality is that the current Conservative Party in the UK is stuffed to the rafters with nasty politicians just like her.

Priti Patel's voting record in parliament: https://www.theyworkforyou.com/mp/24778/priti_patel/witham/v...

"Generally voted for requiring the mass retention of information about communications"

"Voted for mass surveillance of people’s communications and activities"

Not to forget that immediately on being appointed Home Secretary she was accused of breaking the ministerial code yet again, in May! It's currently the only significant point under the Home Secretary section of her Wikipedia page:


Most of the reasonable and moderate members of the Tory party are on the back benches, leaving or about ready to retire. The old, reasonable, one nation Tory party is dead as a dodo.

The alternative view is that this is the first reasonable and moderate cabinet since the referendum.

Consider that it's the first one which appears to be serious about actually doing what the government repeatedly said it would do, both before and after the vote. A government doing what it promised it would do is reasonable. It is led by a man who wants very much to reach an acceptable deal with the EU, but will leave without one if the EU makes it necessary. That's a reasonable and moderate position of the sort that millions of business leaders take every single day.

The previous cabinet had a position like this: we're saying we'll leave no matter what, but we're lying because we definitely won't ever leave without a "deal" of some sort, which basically means the party we're negotiating with can propose whatever terms they like and we'll always accept them regardless of how terrible. Thus an "agreement" which is universally regarded as awful is presented as the only possible path forward, other than ignoring the biggest vote in British history. That's not at all a reasonable way to go about negotiations or politics. Nor is it even slightly moderate - "we must accept terrible terms or else we'll be destroyed" is an unusually extreme belief, of the sort usually held by countries which just lost a war.

Patel may have broken some ministerial code, and that's bad. But the former Prime Minister and her cabinet said 108 times the country would leave the EU on the deadline with or without a deal and they were lying every single time. The cabinets before that told voters they were committed to bringing down immigration, but after leaving government Osborne admitted the cabinet never believed in their stated goal, didn't want to do it and therefore just ignored it. That sort of blatant, knowing manipulation is far, far worse and completely destructive to trust in politics. Meeting Israelis without filing the right paperwork is trivial compared to it.

Whatever the former prime minister, or the current prime minister say, it only becomes so once it is voted into being. When you have a majority of 3, and a controversial policy, prior to vote it's just aspiration. If Theresa was lying, Boris is likely to be lying. Neither brought good governance.

Unless there is an election and the Tory party are returned with increased majority, it is certain that the current prime minister will have as little success in the house as his predecessor. They have a majority of 3, soon to be 2, held up by the DUP. There are enough moderates left in the Tory party to lose the government a vote on every problematic exit scenario, and on a no-deal exit, just as was the case for Theresa.

> Meeting Israelis without filing the right paperwork is trivial compared to it

No, it was worthy of dismissal. She preempted that by resigning. She was not Foreign Secretary or Prime Minister. It was not her role to make policy on the hoof on a topic irrelevant to her office. She blew her chance to explain by continuing to leave out some of the meetings, resulting in a second summons to Downing St. She does not deserve to return to senior office.

That may well be the case, but May didn't have to ask for an extension to Article 50 or have a man as her Chancellor who was totally against no deal. She chose to do those things. And it was - to nobody's surprise - later revealed that she never even brought up the possibility of no deal with her EU counterparts.

In the end, the current cabinet is much more likely to try and implement the Conservatives' actual manifesto. The only reason you describe that as immoderate and extreme is that you want them to fail to do so. It wouldn't be considered so if "UK" and "EU" were replaced with random tokens and presented to people who weren't invested in British/EU politics.

No one in their right mind would expect them to mention the possibility of no-deal - it's the idiotic option to be avoided. Mentioning it or threatening it in negotiations is brinkmanship of the worst kind, as a no-deal is more damaging for us than the EU.

Moderate and immoderate are well known and defined political positions completely unrelated to how you are attempting to redefine them.

I recommend watching this clip from Question Time on the subject of capital punishment:


I'm still shocked that we have someone as Home Secretary who thought capital punishment was a good idea.

"ultimate burden of proof"??

Are we going to now have three verdicts?

Not guilty

Guilty

No, we really mean it this time, actually properly guilty.

Terrible, cons have always been the nasty party and will continue this legacy.

Her unapproved meetings with foreign states should have been enough to warrant her resignation, but how she continues baffles me.

Good social connections.

>The controversial so-called “ghost protocol” has been fiercely opposed by companies, civil society organizations and some security experts – but intelligence and law enforcement agencies continue to lobby for it.

Even if it was possible, I think the bigger question is: do we want to live in a society where any and all conversations can be eavesdropped on? I get the point that they want it for investigations, but it's been proven over and over that if there is a way, it will be abused.

Would intelligence and LE also be ok with that same rules applying to them?

The answer is of course no: they have the thought-terminating cliché of "national security" to protect against accountability.

Really, the fact that intelligence and law enforcement agencies are lobbying at all is utterly fucked up. Their purpose is to serve us, not the other way around.

If people with actual sense were in charge, the people pushing for it would be fired and out the door so fast they'd break the sound barrier. Actively undermining the thing the nation benefits from the most economically, making it weaker to attackers, all while not making adversaries any weaker? That is inexcusable incompetence.

> Their purpose is to serve us not the other way around.

However "we" want them to be able to "stop the bad guys" and "monitor bad communication". "We" also have nothing to hide.

This YouGov poll shows more Americans support backdoors in encryption than oppose them:



At first I thought that you linked to the wrong poll by mistake.

The last question does show more "favor" than "oppose" for installing back doors in encrypted systems, but the first question shows much more "oppose" than "favor" for reducing encryption to help government agencies. The second question shows that more people want tech companies to protect customer privacy than to cooperate with government agencies to fight terrorism and crime.

So people want encryption back doors that don't reduce encryption and don't require tech companies to cooperate with government agencies. Of course "we can have all the good stuff and none of the bad stuff" is a common delusion among government agencies proposing encryption back doors too.

Yes, since the operational outcomes of the first and last questions are the same, the main notion this poll really confirms is that people in general don’t understand cryptography.

c.f. also the recent Australian prime minister who claimed that legislation can override mathematics.

> c.f. also the recent Australian prime minister who claimed that legislation can override mathematics.

I believe the phrase is PI IS EXACTLY 3

According to https://en.wikipedia.org/wiki/Indiana_Pi_Bill the value of pi would have been defined as 3.2 in Indiana.

I mean, when you word the question like that and say the backdoor is to “monitor suspected terrorists”, of course the average idiot will agree. They think it won’t apply to them and their communications. They aren’t terrorists, so why should they mind? Most people have no idea how far the government is able to stretch the law under the auspice of “safety.” Perhaps naively, I believe people would be outraged if they actually understood how the Patriot Act is used and what a backdoor would allow the government to do.

> when you word the question like that and say the backdoor is to “monitor suspected terrorists” of course the average idiot will agree

And that's where democracy falls down

The funny thing is, I bet 90% of the people supporting backdoors here also think China's government monitoring is a gross violation of human rights.

The other funny thing is that if you're a citizen of one of the Five Eyes and China were putting a backdoor into phones, it probably wouldn't matter too much, since they have little influence on you, unlike, say, your own government having that same capability and using it.

> However "we" want them to be able to "stop the bad guys" and "monitor bad communication".

This is appropriate, under the assumption of accountability; right now, three-letter agencies aren't subject to it.

> "We" also have nothing to hide.

This is disingenuous or naive (and it's a worryingly widespread idea). Literally (as in literal-literal) anybody can be accused and charged; it's just a matter of legal power¹. Giving up privacy makes it dangerously easier.

¹=There's even a book on this subject (although the angle is not precisely this): https://www.amazon.com/Three-Felonies-Day-Target-Innocent/dp...

I think it’s normally wrong to pose these issues as intelligence services trying to come up with new ways to oppress the population. It requires some kind of conspiracy either of the whole organisation or a conspiracy of the highest levels to trick the rest of the organisation into oppressing the population.

In reality these are massive organisations of people who want to do good and protect people from actual dangers and repeats of actual harmful incidents. So I think framing the motives as malevolent isn’t helpful because the motives aren’t malevolent.

I think it’s much more reasonable to ask why these things arise. Eg maybe the government says “how will you stop something like x happening again” and they say “well it would have been really hard to detect but we were slightly suspicious of them. If only we could get a warrant to find out what they were talking about...”. And this probably seems reasonable to the minister who still thinks these intelligence agencies are steaming open letters or tapping into phone lines.

It doesn’t even need to be the case that people know these laws would work/be useful, all they need is to feel that they would. And this can quite easily happen without any malicious intentions.

Other things one could imagine happening are finding warrants annoying because they feel like a formality and feeling that the pause in the process potentially causes harm. Or seeing the whole “I ask my ally to spy on my citizens” process as a silly way to get round an annoying loophole. I can imagine something like this happening in a multinational company and if you see intelligence allies as actually working together in a team it doesn’t seem so crazy to see it as a silly legal formality to allow the actual teamwork. So (to say the same thing again) I don’t think these things arise from bad intentions.

A final thing is that many people in these intelligence organisations seem to care about how this surveillance is done in an ethical way (although some people don’t). Eg note here that they want to get this ability with a warrant (perhaps they really want it warrantless and plan to get it or perhaps they feel like they were burned by the various revelations and don’t think they could get it anyway).

Compare this to the way much of the modern mass surveillance we are exposed to every day is planned where there is virtually no ethical oversight at all.

Those are all great points, but it's still unbelievably scary to think of a government storing all digital communications of all its citizens forever.

You'd have to be super naive or ignorant of history to think any different. Even if you trust our current regime, you never know what could happen in the future. It's just too much potential harm in exchange for the convenience of simply not having to do targeted spying instead of mass surveillance.

Nobody likes terrorists but mass surveillance is just way too open to abuse.

To be clear, I’m not saying that these changes are good or that they can’t/won’t be used to oppress. I merely want to say that they aren’t designed and planned to do that and so I don’t think it’s helpful to frame arguments about it as fighting against tyranny.

>Even if it was possible, I think the bigger question is: do we want to live in a society where any and all conversations can be eavesdropped on? I get the point that they want it for investigations, but it's been proven over and over that if there is a way, it will be abused.

I'm going to defend what is probably the minority opinion on this site and say yes, I'd rather live in a society where communication is open to surveillance.

The reason being that the situation appears to me very binary (and I think most people would agree on this): either there's strong encryption, in which case almost all communication is not subject to surveillance, or the state has the capacity to eavesdrop.

The first scenario scares me because it essentially eliminates the ability to engage in surveillance when it is needed, be it against financial fraud on a wide scale, terrorism, crime, radicalisation, or whatever else, and society has a vested interest in having the capacity to prevent this.

I don't think the two most common criticisms hold up. The first one is that surveillance affects many people adversely. I don't think that's true: nobody has an interest in eavesdropping on average citizens; it's simply a waste of resources. The second one is the slippery-slope line of argument you brought up. I don't think there is a lot of evidence that, in states of law, surveillance has been abused or employed illegally.

> I don't think the two most common criticisms hold up. The first one is that surveillance affects many people adversely. I don't think that's true. Nobody has an interest in eavesdropping on average citizens, it's simply a waste of resources. The second one is the slippery-slope line of argument you brought up. I don't think there is a lot of evidence that, in states of law, surveillance has been abused or employed illegally.

What about in the current case of the Nicaraguan government? https://www.hrw.org/world-report/2019/country-chapters/nicar...

I think the Ortega government falls far outside the scope of something we can call a state of law, which I qualified my comment with.

I think it's valid to say that governments like these do abuse surveillance, but my problem with this as an argument in these discussions is that it also equally applies to anything else. They abuse the power of police, of the military, of state owned enterprises and anything else, but yet in other nations we still rely on all of these facilities to a large degree.

So I think there should be a distinction between problems intrinsic to surveillance and bad actors using surveillance as a tool for abuse.

Well we can qualify things to our argument's benefit all day, but the underlying point of trusting that a government will always be a "good actor" is a proven flawed premise.

I don't think one needs to believe that the government is always a good actor. The question is whether the benefits of surveillance to, say, public safety and order outweigh the likelihood of bad outcomes or abuse. That's not a trivial question, and it differs strongly depending on which country we're talking about.

Here is some evidence that in states of law like the US, surveillance has been abused. [1]

[1] https://www.privateinternetaccess.com/blog/2016/09/police-ro...

I don't think that falls into the category of surveillance in the context of encryption, because individual police officers abusing privileged databases will still be a thing in a perfectly encrypted world, unless we take all information away from official agencies.

I was thinking more along the lines of targetting of political dissidents, building unlawful programs, arresting people on the basis of information not lawfully collected, and so on.

The Snowden revelations brought this topic up when it comes to US intelligence abroad, but I don't think such violations ever occurred inside the US (or the respective western countries, say).

So, you're saying you don't know about the DEA's use of parallel construction as outlined in their operating manual? Or the FISA courts, or the various LOVEINT scandals the NSA engaged in?

Or let's go old school. Remember good ole' J. Edgar Hoover? Even before the digital age, the man created enough waves through amplified access via HUMINT to cast a pall of doubt over decades of politics.

There is absolutely no rational reasoning to endorse further centralization to enable systemic abuse. No privileged system will remain free of abuse in the face of those seeking ultimate power. The only way to prevent those seeking it from finding it is to identify the power grab when it happens, and shut it down.

I absolutely do not condone the Four Horsemen of the Infocalypse, but let me be clear: a world with pervasive and perfect surveillance is a world where the machinery, if left to its own devices, inevitably tends toward the destruction of our humanity.

I know the religious language may not carry as much weight for some, but I think in this rare case it's the most concise way I can make my point. As sinners we are born; through forgiveness, repentance, and redemption we grow and cultivate that which, as a species, is said to entail all that is generally regarded as beautiful and virtuous about us. Part of that, as a societal unit, comes in the form of fighting the most vile amongst us without abandoning the moral high ground.

We don't have police procedures and the rules around dispensing justice because we want above all else to punish criminals; if we were really out to do that, just point the mob of the majority at everyone they disagree with or deem a criminal and 'let God sort it out' as the most ruthlessly pragmatic would say.

On the contrary, though: we make it so hard to police, we constrain acceptable methods of investigation, because at some level we all understand the violence inherent to the system, and the inevitability of its occasional employment. We rein it in so that we may in some manner drive it, and live not in constant fear of those who drive it, as they too are (supposed to be) bound by its laws.

It should strike a tone of alarm in anyone when those acting as the executors of the system's mandates begin fighting to loosen the rein. In no case does leaving that unchecked bode well for anyone involved.

Does this count as along the lines of targeting political dissidents?[1]

[1] https://www.nbcnews.com/politics/immigration/u-s-officials-m...

> Would intelligence and LE also be ok with that same rules applying to them?

Intelligence agents willingly sacrifice much of their personal privacy as a condition of their security clearance, so by selection bias would be more willing to subject others to loss of privacy.

Also, there is no exception in the law for LEOs or intelligence officers, the same rules literally do apply to them.

On paper yes. In practice no.

I wasn't referring to personal privacy so much. More so the conversations they have that they are trying to keep secret.

So I broadly agree with your point (that 'backdoors are bad'), but I'm both curious, and want to play devil's advocate.

> its been proven over and over that if there is a way it will be abused.

Is there actually any good, cite-able instance of government backdoors being abused? I believe it is possible, but i don't know of any instance of it happening.

Food for thought: Some of the zero days we have heard about could intentionally have been left in there at the behest of the government for their use.

Well there is a middle ground - one where any conversation could be examined after the fact but only with an auditable warrant-based investigation.

Which still requires a backdoor or a controlling agency, which is insecure by virtue of existing and can be exploited by anyone given some time and effort.

The same can be said for the organization distributing E2EE messaging app binaries (or source + compiler) even without any official backdoor.

There isn't, though, unfortunately. If you allow it, audit trail or not, it can and will be abused; the FISA court is an example of that exact type of warrant abuse. Even without a warrant, someone would have to hold the master key or keys. Any nation state not in possession of them would go to great lengths to obtain them, and whoever did hold them would abuse them.

They seem to unlock most of this info though by attacking the endpoint itself anyway.

Ever since Edward Snowden's revelations in 2013, I've had zero sympathy for or trust in any intelligence service, even in purportedly democratic countries.

Last year, my own country (Australia) passed a law which allows the government to force companies or even individuals to add backdoors to their products, and makes it a criminal offence to refuse or publicly disclose their requests. I would go to jail before I complied.

For those of you in other five eyes countries, you'll have similar laws soon too. Our intelligence agencies have clearly set themselves up against fundamental principles of human rights, and their efforts to undermine these must be fought.

One of the scariest parts of this to me is that the vast majority of Aussie developers don't seem to even be aware that such a law was proposed, let alone already passed.

This all shows what idiots Australia's parliamentarians are. If the issue is encryption, then there are very simple ways of using unbreakable encryption systems without relying on asymmetric keys (i.e. the one-time pad, or Vernam cipher). Granted, it will not suit all use cases, since a means of identifying which key to use needs to be agreed outside of the network processes: face to face or via messengers, for instance. But since any file can be used as a key (music, text, video, an object program, etc.), and having a key larger than the text encrypted eliminates repetition patterns, this is a totally unbreakable system. Even quantum computing would totally fail to decrypt a message! It is super simple to implement with a modicum of programming knowledge; it does not require any maths skills. An example can be found here: https://gitlab.com/MidGe48/cryptopad I can expand on the means of communicating and sharing keys, which are simple and untraceable, without requiring ongoing communication after an initial, simple exchange.
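A minimal sketch of the XOR-based Vernam scheme described above. One caveat worth hedging: the information-theoretic security proof requires key bytes that are uniformly random, used once, and never reused; the bytes of an ordinary music or video file are not uniformly random, so this sketch generates a fresh random key instead.

```python
# Toy Vernam / one-time-pad cipher: XOR each plaintext byte with a key byte.
# Unbreakable ONLY if the key is truly random, at least as long as the
# message, kept secret, and never reused.
import secrets

def otp_xor(data: bytes, key: bytes) -> bytes:
    if len(key) < len(data):
        raise ValueError("key must be at least as long as the message")
    return bytes(d ^ k for d, k in zip(data, key))

message = b"meet at dawn"
key = secrets.token_bytes(len(message))   # fresh random key, used once
ciphertext = otp_xor(message, key)
assert otp_xor(ciphertext, key) == message  # XOR is its own inverse
```

The same function both encrypts and decrypts, since XOR is self-inverse; the hard part of the scheme, as the comment notes, is distributing the keys out of band.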

> allows the government to force companies or even individuals to add backdoors to their products

I think the tech media and community overstate the impact of this law. The law [0] makes it clear that the backdoor cannot introduce any systemic weakness or vulnerability, which explicitly includes "a new decryption capability in relation to a form of electronic protection".

What it allows is stuff that targets a specific person _and_ is incapable of affecting anybody else. The second part overrides the first part, so if it's not possible to target a specific person without weakening protection for everybody else, you're not required to do anything.

For example asking you to put code into your app that creates a copy of private keys and sends them to ASIO if the user's ID matches a hard-coded value would be legally okay per my reading of the law.

However adding ASIO's key to every single message would not be okay.

I'm not saying I'm in favour of the law (I'm not) but its actual effect isn't at all what people assume (I hear a lot of comments about "Australia banned encryption" and other such nonsense).

[0]: http://www5.austlii.edu.au/au/legis/cth/consol_act/ta1997214...

What stops the government from just saying I want to target every single specific person that uses your app?

Sections 317JC, 317RA and 317ZAA, which require that the decision makers consider the impact on unrelated people and section 317ZH which requires that a warrant is obtained for things that would usually require a warrant.

There are no unrelated people if that is the ask of the government. What would "usually" require a warrant?

From what I've learned about encryption and cryptography in general, it seems like you don't get to put this cat back in the bag once it gets out.

You can hold all the meetings you want, pound fists on the table, elegantly restate your problem, but the mathematical fundamentals are your immovable object. Your only option is to block it throughout your nation, and this just makes room for a new or updated version of the fly you swatted last week, one that gets around your flyswatter.

Sure, you can try to poison the code base or inject some kind of malware, but this trick only works once. It's not a silver bullet.

I wrote a blog post about this a while back, when the UK government was talking about adding backdoors to encrypted messaging platforms, framed as an open letter to our PM. It's a basic introduction to cryptography (I'm no expert, though).


Given that the maths is "out of the bag", any motivated criminal organisation or group that is intent on not being caught can quite easily encrypt their own communications. The only people who won't are the innocent public, who can be spied on with impunity.

>Given that the maths is "out of the bag", any motivated criminal organisation or group that is intent on not being caught can quite easily encrypt their own communications. The only people who won't are the innocent public, who can be spied on with impunity.

It's always the innocent public who is the target of such moves. The goal is state omniscience, not crime fighting.

Competent criminal organisations wouldn't care about laws banning encryption, and would know to use the proper tools.

The random non-competent criminals caught this way, would be used to justify the measure...

I wouldn't be so cynical. The real goal is to catch stupid criminals. Not all criminals are stupid, but most are.

Of course governments cannot stop tech people from using e2e encryption. That's not what bothers them. What bothers them is that e2e encryption is the default. They want to change the default to be insecure.

The people who demand these laws are tasked with making various statistics change, such as crime rates. They are metaphorical paperclip maximisers: a surveillance state is not a goal, it is just a means to an end, a way to make their numbers look better.

>I wouldn't be so cynical. The real goal is to catch stupid criminals. Not all criminals are stupid, but most are.

The real goal is to expand the state. Bureaucracies, like cancer, want to expand into any aspect of their subjects' lives they can. They get job security, bigger budgets, and more power as they expand; what's not to like?

Individual opinions (of politicians, bureaucrats, etc) don't really matter, as the aggregate dynamics of a state/organization tend to expansion and self-preservation -- even if an organization has no real role or is mostly BS (eg TSA).

Besides, nothing wrong with being cynical. Remember McCarthy? The real goal of the data collection then wasn't to "catch stupid criminals". They collected data on politicians, journalists, businessmen, etc. Same with the records from 10 and 20 years later that are now public: check the files on people like MLK, Lennon, civil rights leaders, etc.

(And of course, anybody with experience of European governments like Italy, Greece, Spain, France, and so on knows that private info/surveillance is used all the time against whomever the state considers an "internal enemy". In fact, up to the 90s, collection on those people was far more common than collection on criminals, who merely had their criminal records and the periods when they were bugged on suspicion of a crime, whereas politicians, businessmen, activists, writers, journalists, etc. were tracked all the time.)

While I agree for the most part, these are people who are in positions of power and would like to stay there. They have their personal biases, and their livelihoods depend on their staying in power.

When they say (not in so many words) that "I want for us to be able to bulk monitor communications", I'm not willing to concede that they aren't considering or aware of some of the possible second order effects.

To me it seems more likely that they feel they (may) get to move their crime rate statistics and, when their mind goes for a cursory wander over the likely consequences, they find all of those to be mildly pleasant as well.

Or you can always make it illegal, and no commercial chat program will ship with it.

Then you confiscate devices and jail people when you believe (or have "reasonable suspicion") that they use one.

Problem solved for the 99% of the population!

A government can punish people for using cryptography, punish failure to disclose keys, or treat use of cryptography as circumstantial evidence.

If this works, it won't actually stop strong e2e encryption, it'll just make people download their strong e2e encryption communication apps from non-mass-surveillance states.

This isn't p*rnhub. You can't backdoor everything.

> This isn't p*rnhub.

Why would you censor pornhub?

People generally don't care about LE reading what they say, so no. People used WhatsApp before E2E and will keep using it after E2E.

Maybe your "people" are cool with LE reading their salty messages to significant others, or hot take political commentary, but all the "people" I know expect privacy as good responsible citizens should.

One might reasonably assume the "bad guys" they're trying to catch would go elsewhere though, if they have any sense. So then you're just left with innocent people to spy on.

>So then you're just left with innocent people to spy on.

Those are who they want to spy on anyway.

For serious criminals, terrorists, etc., they have other tools; banning commercial encryption won't help with those...

You'd be surprised to learn how fucking stupid most bad guys are.

We catch lots of stupid bad guys.

Smart bad guys get away with it.

Cynical and correct observation, which I know will get downvoted because it goes against the mantra of this website:

Being able to read whatsapp would help us catch many more stupid bad guys.

Smart bad guys will always be able to get away with it. That doesn't mean we should stop trying to catch stupid bad guys.

What if catching the stupid bad guys just means the smart bad guys take their place? Like a spray that kills 99% of bacteria, all you're potentially doing is applying a selective pressure towards being more technically smart.

And in this case, being more technically smart might just mean clicking the link to the E2E encrypted web chat site rather than the server-to-client encrypted site. Perhaps, though, the government will start banning websites that offer E2E encrypted chat, and require hosting companies to not let you host such apps yourself.

Does nobody realize how inconvenient it is that the relationship between the radius and circumference of a circle cannot be readily calculated by hand? Our manufacturing processes would be greatly improved by silencing those so-called 'mathematicians' and standardizing on a value of 3 for pi, not that never-ending mess.

"We need to ensure that our law enforcement and security and intelligence agencies are able to gain lawful and exceptional access to the information they need"

This entitlement is obscene.

If backdoor access is granted then a new set of heads will emerge from the hydra.

This just in:

Dealing with the challenges faced by having a mere 97% conviction rate, federal prosecutors and law enforcement conspire with foreign powers to remove pesky civil liberties.

In addition to all the ethical and privacy concerns brought up here, another thing that always crosses my mind is do we even trust these entities to properly store this information? What happens if all data that is being collected from these backdoors is compromised? Think about every private conversation you’ve ever had potentially leaking to the entire world.


> By choosing a simple but strong cipher that is already widely published and agreeing on how to use it, anyone with elementary programming skills can write their own encryption program without relying on any products that can be banned.


And Pontifex, aka the Solitaire Encryption Algorithm (SPOILER ALERT):

> In Neal Stephenson's novel Cryptonomicon, the character Enoch Root describes a cryptosystem code-named "Pontifex" to another character named Randy Waterhouse, and later reveals that the steps of the algorithm are intended to be carried out using a deck of playing cards.


Laws cannot stop encryption; they can maybe stop law-abiding people from using it, but not criminals.

The ciphersaber idea is great, but the cipher that was chosen for the project has been subject to significant attacks (further developments of the ones mentioned on that page) and has been deprecated in TLS. (The attack setting for attacking RC4 in HTTPS is probably a lot easier than attacking short person-to-person communications, but it's still not a good sign.)

Also, when you implement the ciphersaber, you're still only about 1/4 of the way to the functionality of early-1990s PGP, notably lacking any public key functionality.

> For file encryption, a user need only memorize one key or passphrase. For messaging, users need to exchange pairs of keys through some secure means, most likely in person. Maintaining a list of correspondent's keys or passphrases in a master file, preferably itself encrypted with a memorized master key, is less convenient than public key encryption. But it may be all that is left in a few years if PGP key servers are banned.

> It may even be possible to teach a manual version of the Diffie-Hellman key exchange, perhaps using large number calculators (easily built in Java 1.1). The Diffie-Hellman procedure need be carried out just once per pair of correspondents, since CipherSaber eliminates the need to exchange keys for every message.
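As a toy illustration of the Diffie-Hellman procedure the quoted page mentions: the prime, generator, and secret exponents below are arbitrary small example values chosen only so the arithmetic is easy to follow; real deployments use primes of 2048+ bits (or elliptic curves).

```python
# Toy Diffie-Hellman key exchange (illustration only -- parameters far
# too small for real use).
p, g = 2087, 5                    # public: small prime modulus and generator

a_secret, b_secret = 123, 456     # each party's private exponent

A = pow(g, a_secret, p)           # Alice transmits A = g^a mod p
B = pow(g, b_secret, p)           # Bob transmits   B = g^b mod p

# Both sides derive the same shared secret without ever transmitting it:
assert pow(B, a_secret, p) == pow(A, b_secret, p)
```

Python's three-argument `pow` performs modular exponentiation efficiently, which is why the "large number calculator" the page imagines is not actually hard to build.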

Apart from the implausibility of some of this, you have a very severe issue about key synchronization if you literally only want to do a key exchange once. For example, an attacker who can intercept one party's message and then trick another party into encrypting a known plaintext with the same key material (because that party doesn't know that the keystream has advanced yet?) can then decrypt the intercepted message.

Even having the two users accidentally use the same part of the keystream to send separate unknown messages m₁ and m₂ will allow an adversary to compute m₁⊕m₂, which is very bad in many cases. One thing I remember from Dan Boneh's cryptography class is that if either message contains an ASCII space character (' ') at some position, then m₁⊕m₂ will contain the other message's plaintext with uppercase and lowercase inverted (for example, ' '^'q' is 'Q').
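The keystream-reuse leak described above can be demonstrated in a few lines (the messages and keystream here are made-up example values):

```python
# Two-time-pad failure: XORing two ciphertexts that share a keystream
# cancels the keystream entirely, leaving m1 XOR m2 for the attacker.
import secrets

def xor(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

m1 = b"attack at dawn"
m2 = b"    hold fire "          # same length as m1
keystream = secrets.token_bytes(len(m1))

c1, c2 = xor(m1, keystream), xor(m2, keystream)
leak = xor(c1, c2)              # keystream cancels: equals m1 XOR m2
assert leak == xor(m1, m2)

# The space trick: a space XORed with a letter flips that letter's case,
# because 0x20 is exactly the ASCII case bit.
assert chr(ord(' ') ^ ord('q')) == 'Q'
```

Wherever one message has a space, the XOR of the two ciphertexts exposes the other message's character with its case flipped, which is often enough to recover both plaintexts by crib-dragging.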

The ciphersaber idea is conceptually really great, and I love the idea of helping teach people to create their own communications and communications security infrastructure. But I think that, apart from just how archaic the cryptographic technology it teaches is, the project really underestimates how far away this cipher implementation is from a complete system.

I should probably also acknowledge that Arnold Reinhold apparently started writing this page in 1997, and so the techniques presented there weren't nearly as far from the state-of-the-art then!

Tony Ma's daughter better not go to any of the Five Eyes countries at this point.

For those of you searching for good E2E-encrypted messaging apps, Wire is really good. It has true cross-platform support without being tied to a phone number.

I know a lot of folks who are technical (and non-technical), and it was a big enough struggle to get them using Signal. I definitely see the advantages of Wire and not being tied to a phone number, but I just don't know anyone using it. It's odd, because some of those folks are using Keybase chat; they just balk at installing yet another messaging app (regardless of features).

I tried to get people to use Wire, but it was buggy enough (and still is, with most issues ignored) that the majority of converts have abandoned it. I have continual problems. If it had been reliable it could have been a killer app, as it was basically Skype but encrypted and ostensibly available on all platforms. Having a standalone (no linked phone), browser-agnostic option is a great feature. But it really feels like they left the free/personal version in the dust for their Pro/Enterprise option, and from reading GitHub complaints and seeing a former employee discuss it in another forum, I'd assume that to be accurate. Signal isn't much better from what I've read; it also still requires a linked phone and is Chrome-only on the desktop (unless I am behind the times). It seems the space for an all-in-one encrypted communication/sharing tool accessible to all people and platforms has sort of died. I will never trust FB, so even if people way smarter than me say WhatsApp is safe, it still feels so dirty... and again, the phone-required bit.

Wire's "not tied to a phone number" part is good but constant delayed messages and lack of (useful) time stamps for each message (a la Signal) make the app pretty much unusable for me (and probably many others).

At some point, people are going to see the problem as not of a lack of privacy technologies, but of a small group who surveils them and exploits their personal information to keep them disadvantaged, and they will decide that this is the problem they need to solve.

It is also likely that the technology they use to solve that problem will be much less sophisticated.

Ballots, surely.

> Ballots, surely.

Well, one of the four boxes, anyway.

This will just drive people to more and more distributed platforms until there is nothing they can do.

“Creation, distribution, discussion, or any thought about encryption is illegal and will be punishable by up to life in prison under new anti-terrorism laws.” They will certainly try I am sure but at the end of the day encryption is just math and that is impossible to ban.

Thoughts eh?

Or could it be they already have a back door and stories like this serve them by making people believe they don’t.

By this logic, the fact that there were no stories about the government not being able to spy on messenger apps before, means that those apps were perfectly secure until they added E2E crypto. That doesn't sound right to me.

As we worry about 'Five Eyes' spying on us, we happily give Google and Facebook all our communications for free...

That is by choice. I have no choice about the government spying on me.

In contemporary society you are usually quite the outcast when you say you don't do social media. You have the choice between being an outcast and being spied upon. Similarly for government spying in messenger apps: if you are an outcast who doesn't buy a smartphone or computer you are not affected.

I can't count the number of assholes I have avoided by not being on Facebook ;)

I can count the number of assholes I have met on Facebook: it's 0. I only put people I like on it in the first place, and surprise surprise, I enjoy using it.

It's not the tool, people; it's how you're using it.
